STEM: A Software Tool for Large-Scale Proteomic Data Analyses

2005 ◽  
Vol 4 (5) ◽  
pp. 1826-1831 ◽  
Author(s):  
Takashi Shinkawa ◽  
Masato Taoka ◽  
Yoshio Yamauchi ◽  
Tohru Ichimura ◽  
Hiroyuki Kaji ◽  
...  
2021 ◽  
Vol 1 (2) ◽  
Author(s):  
Alexander Ostrovsky ◽  
Jennifer Hillman‐Jackson ◽  
Dave Bouvier ◽  
Dave Clements ◽  
Enis Afgan ◽  
...  

2009 ◽  
Vol 25 (5) ◽  
pp. 662-663 ◽  
Author(s):  
Olivier Martin ◽  
Armand Valsesia ◽  
Amalio Telenti ◽  
Ioannis Xenarios ◽  
Brian J. Stevenson

2019 ◽  
Author(s):  
Eduard Klapwijk ◽  
Wouter van den Bos ◽  
Christian K. Tamnes ◽  
Nora Maria Raschle ◽  
Kathryn L. Mills

Many workflows and tools that aim to increase the reproducibility and replicability of research findings have been suggested. In this review, we discuss the opportunities that these efforts offer for the field of developmental cognitive neuroscience, in particular developmental neuroimaging. We focus on issues broadly related to statistical power and to flexibility and transparency in data analyses. Critical considerations relating to statistical power include challenges in recruitment and testing of young populations, how to increase the value of studies with small samples, and the opportunities and challenges related to working with large-scale datasets. Developmental studies involve challenges such as choices about age groupings, lifespan modelling, analyses of longitudinal changes, and data that can be processed and analyzed in a multitude of ways. Flexibility in data acquisition, analyses and description may thereby greatly impact results. We discuss methods for improving transparency in developmental neuroimaging, and how preregistration can improve methodological rigor. While outlining challenges and issues that may arise before, during, and after data collection, we highlight solutions and resources that can help overcome some of them. Since the number of useful tools and techniques is ever-growing, we emphasize that many practices can be adopted stepwise.


2021 ◽  
Author(s):  
Weiqian Cao ◽  
Siyuan Kong ◽  
Wenfeng Zeng ◽  
Pengyun Gong ◽  
Biyun Jiang ◽  
...  

Interpreting large-scale glycoproteomic data for intact glycopeptide identification has been tremendously advanced by software tools. However, software tools for the quantitative analysis of intact glycopeptides lag behind, which greatly hinders exploration of the differential expression and functions of site-specific glycosylation in organisms. Here, we report pGlycoQuant, a generic software tool for accurate and convenient quantitative intact glycopeptide analysis, supporting both primary and tandem mass spectrometry quantitation for multiple quantitative strategies. pGlycoQuant enables intact glycopeptide quantitation with very low missing values via a deep residual network, thus greatly expanding the quantitative function of several powerful search engines, currently including pGlyco 2.0, pGlyco3, Byonic and MSFragger-Glyco. The pGlycoQuant-based site-specific N-glycoproteomic study conducted here quantifies 6435 intact N-glycopeptides in three hepatocellular carcinoma (HCC) cell lines with different metastatic potentials and, together with in vitro molecular biology experiments, illustrates core fucosylation at site 979 of the L1 cell adhesion molecule (L1CAM) as a potential regulator of HCC metastasis. pGlycoQuant is freely available at https://github.com/expellir-arma/pGlycoQuant/releases/. We have demonstrated pGlycoQuant to be a powerful tool for the quantitative analysis of site-specific glycosylation and the exploration of potential glycosylation-related biomarker candidates, and we expect further applications in glycoproteomic studies.


2018 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Roland Erwin Suri ◽  
Mohamed El-Saad

Purpose — Changes in file format specifications challenge the long-term preservation of digital documents. Digital archives therefore often focus on file formats that are well suited for long-term preservation, such as PDF/A. Since only a few customers submit PDF/A files, digital archives may consider converting submitted files to the PDF/A format. The paper aims to discuss these issues.

Design/methodology/approach — The authors evaluated three software tools for batch conversion of common file formats to PDF/A-1b: LuraTech PDF Compressor, Adobe Acrobat XI Pro and 3-Heights™ Document Converter by PDF Tools. The test set consisted of 80 files: 10 files each of the eight file types JPEG, MS PowerPoint, PDF, PNG, MS Word, MS Excel, MSG and "web page."

Findings — Batch processing was sometimes hindered by stops that required manual intervention; depending on the software tool, three to four such stops occurred during batch processing of the 80 test files. Furthermore, the conversion tools sometimes failed to produce output files even for supported file formats: three (Adobe Pro) to seven (LuraTech and 3-Heights™) PDF/A-1b files were not produced. Since Adobe Pro does not convert e-mails, a total of 213 PDF/A-1b files were produced. The faithfulness of each conversion was assessed by comparing the visual appearance of the input document with that of the produced PDF/A-1b document on a computer screen. Meticulous visual inspection revealed that conversion to PDF/A-1b impaired the information content of 24 of the 213 converted files (11 percent). These reproducibility errors included loss of links, loss of other document content (unreadable characters, missing text, missing document parts), updated fields (reflecting the time and folder of conversion), vector graphics issues and spelling errors.

Originality/value — These results indicate that large-scale batch conversions of heterogeneous files to PDF/A-1b raise complex issues that need to be addressed for each individual file. Even with considerable effort, some information loss seems unavoidable when large numbers of files from heterogeneous sources are migrated to the PDF/A-1b format.
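The reported file counts are internally consistent, as a quick check shows. This is a minimal sketch using only numbers from the abstract; the per-tool failure split (Adobe Pro 3, LuraTech 7, 3-Heights 7) follows from the stated "three (Adobe Pro) to seven (LuraTech and 3-Heights)" range.

```python
# Reported evaluation: 3 tools, each given the same 80 test files
# (10 files per format, 8 formats, including 10 MSG e-mail files).
tools = {
    "LuraTech PDF Compressor": {"skipped": 0, "failed": 7},
    # Adobe Pro does not convert e-mails, so its 10 MSG files are skipped.
    "Adobe Acrobat XI Pro": {"skipped": 10, "failed": 3},
    "3-Heights Document Converter": {"skipped": 0, "failed": 7},
}

produced = sum(80 - t["skipped"] - t["failed"] for t in tools.values())
print(produced)  # 213 PDF/A-1b files, matching the abstract

impaired = 24
print(round(100 * impaired / produced))  # 11 (percent of files with impaired content)
```

The 213 figure thus requires both the skipped MSG files and all 17 conversion failures to be excluded, which is how the abstract's numbers reconcile.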


Energies ◽  
2020 ◽  
Vol 13 (3) ◽  
pp. 541 ◽  
Author(s):  
Sourav Khanna ◽  
Victor Becerra ◽  
Adib Allahham ◽  
Damian Giaouris ◽  
Jamie M. Foster ◽  
...  

Residential variable energy price schemes can be made more effective by combining a demand response (DR) strategy with smart appliances. Using DR, the electricity bills of participating customers/households can be minimised while pursuing other aims such as demand shifting and maximising consumption of locally generated renewable electricity. In this article, a two-stage optimization method is used to implement a price-based implicit DR scheme. The model considers a range of novel smart devices/technologies/schemes connected to smart meters and a local DR controller. A case study with various decarbonisation scenarios is used to analyse the effects of deploying the proposed DR scheme in households located in the west area of the Isle of Wight (southern United Kingdom). There are approximately 15,000 households, of which 3000 are not connected to the gas network. Using a distribution network model along with a load-flow software tool, the secondary voltages and apparent power through the transformers at the relevant substations are computed. The results show that in summer, participating households could export up to 6.4 MW of power, which is 10% of the installed large-scale photovoltaic (PV) capacity on the island. Participating households could achieve average carbon dioxide equivalent (CO2e) reductions of 7.1 ktons per annum and a 60% annual reduction in combined energy/transport fuel bills.
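The export figure implies the island's installed large-scale PV capacity; a quick arithmetic check, using only the numbers stated in the abstract:

```python
# 6.4 MW of summer export is stated to be 10% of the installed
# large-scale PV capacity, implying roughly 64 MW installed on the island.
export_mw = 6.4
export_share_of_pv = 0.10
installed_pv_mw = export_mw / export_share_of_pv
print(installed_pv_mw)  # 64.0
```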


2006 ◽  
Vol 63 (5) ◽  
pp. 1377-1389 ◽  
Author(s):  
Tim Li ◽  
Bing Fu

Abstract The structure and evolution characteristics of Rossby wave trains induced by tropical cyclone (TC) energy dispersion are revealed based on the Quick Scatterometer (QuikSCAT) and Tropical Rainfall Measuring Mission (TRMM) Microwave Imager (TMI) data. Among 34 cyclogenesis cases analyzed in the western North Pacific during the 2000–01 typhoon seasons, six cases are associated with the Rossby wave energy dispersion of a preexisting TC. The wave trains are oriented in a northwest–southeast direction, with alternating cyclonic and anticyclonic vorticity circulation. A typical wavelength of the wave train is about 2500 km. TC genesis is observed in the cyclonic circulation region of the wave train, possibly through a scale contraction process. The satellite data analyses reveal that not all TCs have a Rossby wave train in their wakes. The occurrence of the Rossby wave train depends to a certain extent on the TC intensity and the background flow. Whether or not a Rossby wave train can finally lead to cyclogenesis depends on large-scale dynamic and thermodynamic conditions related to both the change of the seasonal mean state and the phase of the tropical intraseasonal oscillation. Stronger low-level convergence and cyclonic vorticity, weaker vertical shear, and greater midtropospheric moisture are among the favorable large-scale conditions. The rebuilding process of a conditionally unstable stratification is important in regulating the frequency of TC genesis.


2011 ◽  
Vol 12 (1) ◽  
pp. 34 ◽  
Author(s):  
Tania Dottorini ◽  
Nicola Senin ◽  
Giorgio Mazzoleni ◽  
Kalle Magnusson ◽  
Andrea Crisanti