Current Advances in DNA Methylation Analysis Methods

2021 ◽  
Vol 2021 ◽  
pp. 1-9
Author(s):  
Ehsan Khodadadi ◽  
Leila Fahmideh ◽  
Ehsaneh Khodadadi ◽  
Sounkalo Dao ◽  
Mehdi Yousefi ◽  
...  

DNA methylation is one of the epigenetic modifications that plays a major role in regulating gene expression and, thus, many biological processes and diseases. There are several methods for determining the methylation of DNA samples; however, selecting the most appropriate method for answering a given biological question can be a challenging task. The earliest methods for DNA methylation analysis focused on identifying the methylation state of the examined genes and determining the total amount of 5-methylcytosine. The study of DNA methylation at the genome-wide scale became possible with the use of microarray hybridization technology, and the new generation of sequencing platforms now allows genomic maps of DNA methylation to be prepared at single-nucleotide resolution. This review covers the majority of methods available to date, introducing the most widely used approaches (bisulfite treatment, biological identification, and chemical cleavage) along with their advantages and disadvantages. The techniques are then scrutinized according to their robustness, high-throughput capabilities, and cost.
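The core logic of the bisulfite approach mentioned above can be illustrated with a short sketch (the sequences and methylated positions are hypothetical, chosen only for illustration): bisulfite treatment converts unmethylated cytosines to uracil, read as T after amplification, while 5-methylcytosines remain C, so comparing a converted read against the reference reveals which cytosines were methylated.

```python
# Sketch of the bisulfite-sequencing principle: unmethylated C -> T,
# methylated C (5mC) stays C. Comparing a converted read against the
# reference reveals which cytosines were methylated.

def bisulfite_convert(seq, methylated_positions):
    """Simulate bisulfite treatment of the forward strand."""
    return "".join(
        base if base != "C" or i in methylated_positions else "T"
        for i, base in enumerate(seq)
    )

def call_methylation(reference, converted_read):
    """Cytosines that survive conversion as C are called methylated."""
    return {
        i for i, (ref, obs) in enumerate(zip(reference, converted_read))
        if ref == "C" and obs == "C"
    }

reference = "ACGTCCGATC"
truly_methylated = {4, 9}          # hypothetical 5mC sites
read = bisulfite_convert(reference, truly_methylated)
print(read)                                                   # ATGTCTGATC
print(call_methylation(reference, read) == truly_methylated)  # True
```

In practice the conversion is incomplete and reads carry sequencing errors, which is one reason the review weighs each method's robustness.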

2017 ◽  
Author(s):  
Amanda Raine ◽  
Ulrika Liljedahl ◽  
Jessica Nordlund

Abstract The powerful HiSeq X sequencers with their patterned flowcell technology and fast turnaround times are instrumental for many large-scale genomic and epigenomic studies. However, assessment of DNA methylation by sodium bisulfite treatment results in sequencing libraries of low diversity, which may impact data quality and yield. In this report we assess the quality of WGBS data generated on the HiSeq X system in comparison with data generated on the HiSeq 2500 system and the newly released NovaSeq system. We report a systematic issue with low basecall quality scores assigned to guanines in the second read of WGBS when using certain Real Time Analysis (RTA) software versions on the HiSeq X sequencer, reminiscent of an issue that was previously reported with certain HiSeq 2500 software versions. However, with the HD.3.4.0/RTA 2.7.7 software upgrade for the HiSeq X system, we observed an overall improved quality and yield of the WGBS data generated, which in turn empowers cost-effective and high-quality DNA methylation studies.
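The read-2 guanine quality issue described in this report can be checked on one's own data with a few lines; the sketch below (the example reads and quality strings are illustrative, not from the report) computes the mean Phred score assigned to guanine basecalls, using the standard Phred+33 encoding.

```python
# Sketch: mean Phred quality of guanine basecalls, a quick check for the
# read-2 'G' quality issue described above. The records stand in for
# parsed read-2 FASTQ entries (illustrative data only).

def mean_g_quality(records):
    """records: iterable of (sequence, quality_string) pairs,
    quality strings in standard Phred+33 encoding."""
    quals = [
        ord(q) - 33
        for seq, qual in records
        for base, q in zip(seq, qual)
        if base == "G"
    ]
    return sum(quals) / len(quals) if quals else float("nan")

reads = [
    ("GATTG", "IIII#"),   # 'I' = Q40, '#' = Q2
    ("CCGGA", "III!I"),   # '!' = Q0
]
print(round(mean_g_quality(reads), 2))  # 20.5
```

A systematically low value for guanines relative to the other bases in read 2 would be consistent with the RTA software issue the authors report.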


2020 ◽  
Vol 32 (2) ◽  
pp. 152
Author(s):  
E. Estrada-Cortés ◽  
P. J. Hansen

Choline is a methyl donor for DNA methylation and a precursor for phosphatidylcholine, the major phospholipid of cell membranes. Early embryonic development involves processes of DNA demethylation and remethylation as well as synthesis of new cell membranes. Addition of choline chloride (ChCl) to the culture medium of embryos produced in vitro increased the birthweight of Brahman calves after embryo transfer. The objective was to determine whether the addition of ChCl to culture medium alters the pattern of DNA methylation and lipid content of pre-implantation embryos produced in vitro. Embryos were incubated after fertilisation in BBH7 culture medium containing 0.0, 0.004, 1.3, or 1.8 mM ChCl (adjusted with NaCl to maintain isotonicity). Concentrations were chosen to approximate free choline (0.004 mM) and total choline (1.3 mM) in plasma of cows at week 1 postpartum and the concentration of total choline in plasma of cows fed rumen-protected choline (1.8 mM). Cleavage and blastocyst rates (n=8 replicates) were evaluated at Days 3 and 7.5 post-insemination, respectively. Embryos ≥8 cells (range 8 to 24 cells; stages near embryonic genome activation) and expanded blastocysts were harvested at Days 3.75 (n=232) and 7.5 (n=204) to estimate global DNA methylation by immunostaining for 5-methyl-cytosine. Methylation of DNA was estimated by calculating the ratio of fluorescence for 5-methyl-cytosine to that of propidium iodide (DNA). Another group of expanded blastocysts was harvested (n=99) to estimate lipid content using Nile Red. Embryo development was analysed by the GLIMMIX procedure and fluorescence data by the GLM procedure of SAS (SAS Institute Inc.). The proportion of zygotes that cleaved after fertilisation (77.5±2.3, 78.1±2.3, 74.5±2.4, and 80.1±2.2% for 0.0, 0.004, 1.3, and 1.8 mM ChCl; P=0.2736) and of cleaved embryos that became blastocysts (37.8±4.4, 41.5±4.6, 42.8±4.6, and 39.6±4.4%; P=0.5764) was similar between treatments.
DNA methylation at both days was affected by treatment (P<0.001). At Day 3.75, 1.3 mM choline reduced methylation, with no effect of the other concentrations (1.13a±0.03, 1.04a±0.03, 0.92b±0.03, and 1.13a±0.04; means with different superscripts differ at P<0.05). For blastocysts, in contrast, DNA methylation was increased in embryos treated with 1.3 and 1.8 mM choline (0.98a±0.04, 1.04ab±0.03, 1.25c±0.03, and 1.11b±0.03). Lipid content in blastocysts was also affected by treatment (P=0.0139); in particular, lipid content was higher in embryos treated with 1.3 and 1.8 mM choline (409.1a±54.3, 542.3ab±62.3, 651.3b±54.3, and 583.9b±55.0). In conclusion, addition of ChCl to culture medium altered DNA methylation in bovine pre-implantation embryos produced in vitro in a manner dependent on developmental stage and choline concentration. Likewise, ChCl increased lipid content in the resultant blastocysts. Support was provided by the Red Larson Endowment.


2020 ◽  
Author(s):  
Lungwani Muungo

The purpose of this review is to evaluate progress in molecular epidemiology over the past 24 years in cancer etiology and prevention to draw lessons for future research incorporating the new generation of biomarkers. Molecular epidemiology was introduced in the study of cancer in the early 1980s, with the expectation that it would help overcome some major limitations of epidemiology and facilitate cancer prevention. The expectation was that biomarkers would improve exposure assessment, document early changes preceding disease, and identify subgroups in the population with greater susceptibility to cancer, thereby increasing the ability of epidemiologic studies to identify causes and elucidate mechanisms in carcinogenesis. The first generation of biomarkers has indeed contributed to our understanding of risk and susceptibility related largely to genotoxic carcinogens. Consequently, interventions and policy changes have been mounted to reduce risk from several important environmental carcinogens. Several new and promising biomarkers are now becoming available for epidemiologic studies, thanks to the development of high-throughput technologies and theoretical advances in biology. These include toxicogenomics, alterations in gene methylation and gene expression, proteomics, and metabonomics, which allow large-scale studies, including discovery-oriented as well as hypothesis-testing investigations. However, most of these newer biomarkers have not been adequately validated, and their role in the causal paradigm is not clear. There is a need for their systematic validation using principles and criteria established over the past several decades in molecular cancer epidemiology.


Author(s):  
Stefano Vassanelli

Establishing direct communication with the brain through physical interfaces is a fundamental strategy to investigate brain function. Starting with the patch-clamp technique in the seventies, neuroscience has moved from detailed characterization of ionic channels to the analysis of single neurons and, more recently, microcircuits in brain neuronal networks. Development of new biohybrid probes with electrodes for recording and stimulating neurons in the living animal is a natural consequence of this trend. The recent introduction of optogenetic stimulation and advanced high-resolution large-scale electrical recording approaches demonstrates this need. Brain implants for real-time neurophysiology are also opening new avenues for neuroprosthetics to restore brain function after injury or in neurological disorders. This chapter provides an overview on existing and emergent neurophysiology technologies with particular focus on those intended to interface neuronal microcircuits in vivo. Chemical, electrical, and optogenetic-based interfaces are presented, with an analysis of advantages and disadvantages of the different technical approaches.


Pharmaceutics ◽  
2021 ◽  
Vol 13 (2) ◽  
pp. 140
Author(s):  
Abdellatif Bouazzaoui ◽  
Ahmed A. H. Abdellatif ◽  
Faisal A. Al-Allaf ◽  
Neda M. Bogari ◽  
Saied Al-Dehlawi ◽  
...  

The current COVID-19 pandemic, caused by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), has raised significant economic, social, and psychological concerns. The rapid spread of the virus, coupled with the absence of vaccines and antiviral treatments for SARS-CoV-2, has galvanized a major global endeavor to develop effective vaccines. Within just a few months of the initial outbreak, research teams worldwide, adopting a range of different strategies, embarked on a quest to develop an effective vaccine that could be used to suppress this virulent pathogen. In this review, we describe conventional approaches to vaccine development, including strategies employing proteins, peptides, and attenuated or inactivated pathogens in combination with adjuvants (including genetic adjuvants). We also present details of the novel strategies adopted by different research groups to successfully deliver recombinantly expressed antigens using viral vectors (adenoviral and retroviral) and non-viral delivery systems, and how recently developed methods have been applied to produce vaccines based on mRNA, self-amplifying RNA (saRNA), and trans-amplifying RNA (taRNA). Moreover, we discuss the methods being used to enhance mRNA stability and protein production, the advantages and disadvantages of different methods, and the challenges encountered during the development of effective vaccines.


2020 ◽  
Vol 22 (Supplement_2) ◽  
pp. ii75-ii75
Author(s):  
Thais Sabedot ◽  
Michael Wells ◽  
Indrani Datta ◽  
Tathiane Malta ◽  
Ana Valeria Castro ◽  
...  

Abstract Adult diffuse gliomas are central nervous system (CNS) tumors that arise from the malignant transformation of glial cells. Nearly all gliomas will recur despite standard treatment; however, current histopathological grading fails to predict which of them will relapse and/or progress. The Glioma Longitudinal AnalySiS (GLASS) consortium is a large-scale collaboration that aims to investigate the molecular profiling of matched primary and recurrent glioma samples from multiple institutions in order to better understand the dynamic evolution of these tumors. At this time, the cohort comprises 946 samples across 11 institutions, of which 864 have DNA methylation data available. The current molecular classification based on the 7 subtypes published by TCGA in 2016 was applied to the dataset. Among the IDH-wildtype tumors, 33% (16/49) of the patients showed a change of subtype upon recurrence; most of these (9/16) were Classic-like at the primary stage and changed to either Mesenchymal-like or PA-like at recurrence. Among the IDH-mutant tumors, 15% (22/142) showed a change of subtype at the recurrent stage, of which 16 out of 22 progressed from G-CIMP-high to G-CIMP-low. Although some tumors progressed to a different subtype upon recurrence, an unsupervised analysis showed that the samples tend to cluster by patient rather than by subtype. By estimating the copy number alterations of these tumors using DNA methylation, we found that the overall copy number profile of the recurrent samples remains similar to that of their primary counterparts. From this initial analysis using epigenomic data, we were able to characterize some aspects of glioma evolution and how DNA methylation is associated with the progression of these tumors to different subtypes. These findings corroborate the importance of epigenetics in gliomas and can potentially lead to the identification of new biomarkers that reflect tumor burden and predict its development.


2021 ◽  
pp. 1-6
Author(s):  
Ben Kang ◽  
Hyun Seok Lee ◽  
Seong Woo Jeon ◽  
Soo Yeun Park ◽  
Gyu Seog Choi ◽  
...  

BACKGROUND: Colorectal cancer (CRC) is one of the leading causes of mortality and morbidity in the world. It is characterized by different pathways of carcinogenesis and is a heterogeneous disease with diverse molecular landscapes that reflect histopathological and clinical information. Changes in the DNA methylation status of colon epithelial cells have been identified as critical components in CRC development and appear to be emerging biomarkers for the early detection and prognosis of CRC. OBJECTIVE: To explore the underlying disease mechanisms and identify more effective biomarkers of CRC. METHODS: We compared the levels and frequencies of DNA methylation in 11 genes (Alu, APC, DAPK, MGMT, MLH1, MINT1, MINT2, MINT3, p16, RGS6, and TFPI2) in colorectal cancer and its precursor adenomatous polyp with normal tissue of healthy subjects using pyrosequencing and then evaluated the clinical value of these genes. RESULTS: Aberrant methylation of Alu, MGMT, MINT2, and TFPI2 genes was progressively accumulated during the normal-adenoma-carcinoma progression. Additionally, CGI methylation occurred either as an adenoma-associated event for APC, MLH1, MINT1, MINT31, p16, and RGS6 or a tumor-associated event for DAPK. Moreover, relatively high levels and frequencies of DAPK, MGMT, and TFPI2 methylation were detected in the peritumoral nonmalignant mucosa of cancer patients in a field-cancerization manner, as compared to normal mucosa from healthy subjects. CONCLUSION: This study identified several biomarkers associated with the initiation and progression of CRC. As novel findings, they may have important clinical implications for CRC diagnostic and prognostic applications. Further large-scale studies are needed to confirm these findings.


Author(s):  
Clemens M. Lechner ◽  
Nivedita Bhaktha ◽  
Katharina Groskurth ◽  
Matthias Bluemke

Abstract Measures of cognitive or socio-emotional skills from large-scale assessment surveys (LSAS) are often based on advanced statistical models and scoring techniques unfamiliar to applied researchers. Consequently, applied researchers working with data from LSAS may be uncertain about the assumptions and computational details of these statistical models and scoring techniques and about how best to incorporate the resulting skill measures in secondary analyses. The present paper is intended as a primer for applied researchers. After a brief introduction to the key properties of skill assessments, we give an overview of the three principal methods with which secondary analysts can incorporate skill measures from LSAS in their analyses: (1) as test scores (i.e., point estimates of individual ability), (2) through structural equation modeling (SEM), and (3) in the form of plausible values (PVs). We discuss the advantages and disadvantages of each method based on three criteria: fallibility (i.e., control for measurement error and unbiasedness), usability (i.e., ease of use in secondary analyses), and immutability (i.e., consistency of test scores, PVs, or measurement model parameters across different analyses and analysts). We show that although none of the methods is optimal under all criteria, methods that result in a single point estimate of each respondent’s ability (i.e., all types of “test scores”) are rarely optimal for research purposes. Instead, approaches that avoid or correct for measurement error, especially PV methodology, stand out as the method of choice. We conclude with practical recommendations for secondary analysts and data-producing organizations.
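As an illustration of the PV workflow this primer recommends (a generic sketch with made-up numbers, not tied to any particular LSAS), a secondary analyst runs the analysis once per plausible value and pools the results with Rubin's rules, so that the pooled variance reflects both sampling and measurement uncertainty:

```python
import statistics

# Sketch of Rubin's rules for pooling an estimate (e.g. a mean or a
# regression coefficient) computed once per plausible value (PV).
# The per-PV estimates and variances below are illustrative numbers.

def pool_plausible_values(estimates, variances):
    """Pool M per-PV estimates and their sampling variances.
    Returns (pooled estimate, total variance)."""
    m = len(estimates)
    pooled = statistics.fmean(estimates)
    within = statistics.fmean(variances)      # average sampling variance
    between = statistics.variance(estimates)  # variance across the PVs
    total = within + (1 + 1 / m) * between    # Rubin's total variance
    return pooled, total

# Five PVs of a skill score yield five per-PV analysis results:
est = [0.52, 0.49, 0.55, 0.50, 0.54]
var = [0.010, 0.011, 0.009, 0.010, 0.012]
pooled, total = pool_plausible_values(est, var)
print(round(pooled, 3), round(total, 4))  # 0.52 0.0112
```

Ignoring the between-PV component (i.e., averaging the PVs into a single score first) is exactly the kind of single-point-estimate shortcut the paper argues against, since it understates uncertainty.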


npj Vaccines ◽  
2021 ◽  
Vol 6 (1) ◽  
Author(s):  
Nikolaos C. Kyriakidis ◽  
Andrés López-Cortés ◽  
Eduardo Vásconez González ◽  
Alejandra Barreto Grimaldos ◽  
Esteban Ortiz Prado

Abstract The new SARS-CoV-2 virus is an RNA virus that belongs to the Coronaviridae family and causes COVID-19 disease. The newly sequenced virus appears to have originated in China and rapidly spread throughout the world, becoming a pandemic that, as of January 5th, 2021, had caused more than 1,866,000 deaths. Hence, laboratories worldwide are developing an effective vaccine against this disease, which will be essential to reduce morbidity and mortality. Currently, there are more than 64 vaccine candidates, most of them aiming to induce neutralizing antibodies against the spike protein (S). These antibodies will prevent uptake through the human ACE-2 receptor, thereby limiting viral entry. Different vaccine platforms are being used for vaccine development, each presenting several advantages and disadvantages. Thus far, thirteen vaccine candidates are being tested in Phase 3 clinical trials and are therefore closer to receiving approval or authorization for large-scale immunization.


2014 ◽  
Vol 26 (4) ◽  
pp. 781-817 ◽  
Author(s):  
Ching-Pei Lee ◽  
Chih-Jen Lin

Linear rankSVM is one of the widely used methods for learning to rank. Although its performance may be inferior to nonlinear methods such as kernel rankSVM and gradient boosting decision trees, linear rankSVM is useful for quickly producing a baseline model. Furthermore, following its recent development for classification, linear rankSVM may give competitive performance for large and sparse data. A great deal of work has studied linear rankSVM, with a focus on computational efficiency when the number of preference pairs is large. In this letter, we systematically study existing works, discuss their advantages and disadvantages, and propose an efficient algorithm. We discuss different implementation issues and extensions with detailed experiments. Finally, we develop a robust linear rankSVM tool for public use.
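The pairwise objective that linear rankSVM optimizes can be sketched in a few lines; this is a naive subgradient-descent toy on hypothetical data, shown only to make the preference-pair formulation concrete, and is exactly the brute-force approach whose cost in the number of pairs motivates the efficient algorithms the letter studies.

```python
import numpy as np

# Toy linear rankSVM: minimize 0.5*||w||^2 + C * sum over preference
# pairs (i, j) with y_i > y_j of max(0, 1 - w.(x_i - x_j)),
# via subgradient descent over all pairs (illustrative, not efficient).

def rank_svm(X, y, C=1.0, lr=0.01, epochs=200):
    n, d = X.shape
    pairs = [(i, j) for i in range(n) for j in range(n) if y[i] > y[j]]
    w = np.zeros(d)
    for _ in range(epochs):
        grad = w.copy()                    # gradient of the regularizer
        for i, j in pairs:
            diff = X[i] - X[j]
            if 1.0 - w @ diff > 0:         # margin violated by this pair
                grad -= C * diff
        w -= lr * grad
    return w

# Three items with relevance labels 2 > 1 > 0; the learned linear
# scores should reproduce the preference order.
X = np.array([[2.0, 1.0], [1.0, 1.0], [0.0, 0.5]])
y = np.array([2, 1, 0])
w = rank_svm(X, y)
scores = X @ w
print(scores[0] > scores[1] > scores[2])   # True
```

Because the number of preference pairs grows roughly quadratically with the number of labeled items, this direct loop is what efficient formulations avoid, which is the computational concern the letter addresses.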

