Anisotropic Correction of Beam-induced Motion for Improved Single-particle Electron Cryo-microscopy

2016 ◽  
Author(s):  
Shawn Q. Zheng ◽  
Eugene Palovcak ◽  
Jean-Paul Armache ◽  
Yifan Cheng ◽  
David A. Agard

Abstract
Correction of electron beam-induced sample motion is one of the major factors contributing to the recent resolution breakthroughs in cryo-electron microscopy. Improving the accuracy and efficiency of motion correction can lead to further resolution improvement. Based on observations that the electron beam induces doming of the thin vitreous ice layer, we developed an algorithm to correct anisotropic image motion at the single-pixel level across the whole frame, suitable for both single-particle and tomographic images. Iterative, patch-based motion detection is combined with spatial and temporal constraints and dose weighting. The multi-GPU accelerated program, MotionCor2, is sufficiently fast to keep up with automated data collection. The result is an exceptionally robust strategy that works on a wide range of data sets, including those collected very close to focus or with very short integration times, obviating the need for particle polishing. Application of the program significantly improves Thon ring quality and 3D reconstruction resolution.
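MotionCor2 itself is a multi-GPU C++/CUDA program, but the kernel of patch-based motion detection, estimating a shift for each image patch by cross-correlation between frames, can be sketched in a few lines. This is a minimal illustration under stated assumptions (plain FFT cross-correlation, integer-pixel peak, hypothetical function name), not the MotionCor2 implementation:

```python
import numpy as np

def patch_shift(ref, mov):
    """Estimate the integer-pixel shift that aligns patch `mov` back onto
    patch `ref`, using cross-correlation computed via the FFT."""
    cc = np.fft.ifft2(np.fft.fft2(ref) * np.conj(np.fft.fft2(mov))).real
    peak = np.unravel_index(np.argmax(cc), cc.shape)
    # Shifts beyond half the patch size wrap around; map them to negatives
    return tuple(int(p) if p <= s // 2 else int(p) - s
                 for p, s in zip(peak, ref.shape))

# Synthetic check: displace a random patch by (-3, 2) and recover the
# correcting shift (3, -2)
rng = np.random.default_rng(0)
ref = rng.random((64, 64))
mov = np.roll(ref, (-3, 2), axis=(0, 1))
print(patch_shift(ref, mov))  # -> (3, -2)
```

In the full algorithm, such per-patch shifts are estimated iteratively over all frames and then constrained to vary smoothly in space and time before dose-weighted summation.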

IUCrJ ◽  
2019 ◽  
Vol 6 (1) ◽  
pp. 5-17 ◽  
Author(s):  
Jasenko Zivanov ◽  
Takanori Nakane ◽  
Sjors H. W. Scheres

A new method to estimate the trajectories of particle motion and the amount of cumulative beam damage in electron cryo-microscopy (cryo-EM) single-particle analysis is presented. The motion within the sample is modelled through the use of Gaussian process regression. This allows a prior likelihood that favours spatially and temporally smooth motion to be associated with each hypothetical set of particle trajectories without imposing hard constraints. This formulation enables the a posteriori likelihood of a set of particle trajectories to be expressed as a product of that prior likelihood and an observation likelihood given by the data, and this a posteriori likelihood to then be maximized. Since the smoothness prior requires three parameters that describe the statistics of the observed motion, an efficient stochastic method to estimate these parameters is also proposed. Finally, a practical algorithm is proposed that estimates the average amount of cumulative radiation damage as a function of radiation dose and spatial frequency, and then fits relative B factors to that damage in a robust way. The method is evaluated on three publicly available data sets, and its usefulness is illustrated by comparison with state-of-the-art methods and previously published results. The new method has been implemented as Bayesian polishing in RELION-3, where it replaces the existing particle-polishing method, as it outperforms the latter in all tests conducted.
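The Gaussian-process prior is the heart of this formulation: it pulls noisy per-frame observations toward a smooth trajectory without imposing hard constraints. As a one-dimensional sketch (illustrative kernel and parameter values, not the RELION-3 implementation), the posterior mean of GP regression with a squared-exponential kernel looks like this:

```python
import numpy as np

def gp_smooth(t, y, length_scale=3.0, sigma_f=1.0, sigma_n=0.5):
    """Posterior mean of GP regression with a squared-exponential kernel:
    a soft prior favouring temporally smooth motion, no hard constraints."""
    K = sigma_f**2 * np.exp(-0.5 * (t[:, None] - t[None, :])**2 / length_scale**2)
    return K @ np.linalg.solve(K + sigma_n**2 * np.eye(len(t)), y)

rng = np.random.default_rng(1)
t = np.arange(20.0)                        # movie-frame index
y = 0.1 * t + rng.normal(0, 0.3, t.shape)  # noisy observed particle positions
smooth = gp_smooth(t, y)                   # smoothed trajectory estimate
```

The three kernel hyperparameters (`sigma_f`, `length_scale`, `sigma_n`) play the role of the smoothness-prior statistics that the paper estimates stochastically from the data.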


IUCrJ ◽  
2019 ◽  
Vol 6 (6) ◽  
pp. 1099-1105 ◽  
Author(s):  
Olivia Pfeil-Gardiner ◽  
Deryck J. Mills ◽  
Janet Vonck ◽  
Werner Kuehlbrandt

Radiation damage is the most fundamental limitation for achieving high resolution in electron cryo-microscopy (cryo-EM) of biological samples. The effects of radiation damage are reduced by liquid-helium cooling, although the use of liquid helium is more challenging than that of liquid nitrogen. To date, the benefits of liquid-nitrogen and liquid-helium cooling for single-particle cryo-EM have not been compared quantitatively. With recent technical and computational advances in cryo-EM image recording and processing, such a comparison now seems timely. This study aims to evaluate the relative merits of liquid-helium cooling in present-day single-particle analysis, taking advantage of direct electron detectors. Two data sets of recombinant mouse heavy-chain apoferritin, cooled with liquid nitrogen or liquid helium to 85 K or 17 K respectively, were collected, processed and compared. No improvement in terms of resolution or Coulomb potential map quality was found for liquid-helium cooling. Interestingly, beam-induced motion was found to be significantly higher with liquid-helium cooling, especially within the most valuable first few frames of an exposure, thus counteracting any potential benefit of the better cryoprotection that liquid-helium cooling may offer for single-particle cryo-EM.


Author(s):  
Y. Kokubo ◽  
W. H. Hardy ◽  
J. Dance ◽  
K. Jones

Color-coded digital image processing is accomplished using a JEM100CX TEM SCAN and ORTEC's LSI-11 computer-based multi-channel analyzer (EEDS-II-System III) for image analysis and display. Color coding of the recorded image enables enhanced visualization through mathematical techniques such as compression, gray-scale expansion, gamma processing and filtering, without subjecting the sample to further electron-beam irradiation once images have been stored in memory. The powerful combination of a scanning electron microscope and a computer is coming into wide use 1) - 4) for image processing and particle analysis. In scanning electron microscopy in particular, all of the information resulting from interactions between the electron beam and the specimen can be collected by using different detectors for signals such as secondary electrons, backscattered electrons, elastically scattered electrons, inelastically scattered electrons, unscattered electrons and X-rays; because each signal carries specific information arising from its physical origin, a wide range of effects can be studied.
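The dose-free reprocessing described above amounts to remapping stored grey levels through a transfer curve. A minimal modern sketch of gamma processing with a lookup table (the names and NumPy implementation are assumptions for illustration, not the original LSI-11 software):

```python
import numpy as np

def gamma_lut(gamma, levels=256):
    """Lookup table mapping stored grey levels through a gamma curve, so
    contrast can be remapped without re-irradiating the specimen."""
    x = np.arange(levels) / (levels - 1)          # normalise to [0, 1]
    return np.round((x ** gamma) * (levels - 1)).astype(np.uint8)

stored = np.array([[0, 64], [128, 255]], dtype=np.uint8)  # image in memory
brightened = gamma_lut(0.5)[stored]               # gamma < 1 lifts mid-tones
```

Gray-scale expansion and filtering can be applied the same way: as table lookups or convolutions on the stored frame rather than on the live beam.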


Author(s):  
Theodoros Tsoulos ◽  
Supriya Atta ◽  
Maureen Lagos ◽  
Michael Beetz ◽  
Philip Batson ◽  
...  

Gold nanostars display exceptional field-enhancement properties and tunable resonant modes that can be leveraged to create effective imaging tags or phototherapeutic agents, or to design novel hot-electron-based photocatalysts. From a fundamental standpoint, they represent important tunable platforms to study the dependence of hot-carrier energy and dynamics on plasmon band intensity and position. Toward the realization of these platforms, holistic approaches taking into account both theory and experiments to study the fundamental behavior of these particles are needed. Arguably, the intrinsic difficulties underlying this goal stem from the inability to rationally design and effectively synthesize nanoparticles that are sufficiently monodispersed to be employed for corroboration of theoretical results without the need for single-particle experiments. Herein, we report on our concerted computational and experimental effort to design, synthesize, and explain the origin and morphology dependence of the plasmon modes of a novel gold nanostar system, with an approach that builds upon the well-known plasmon hybridization model. We have synthesized monodispersed samples of gold nanostars with finely tunable morphology employing seed-mediated colloidal protocols, and experimentally observed narrow and spectrally resolved harmonics of the primary surface plasmon resonance mode both at the single-particle level (via electron energy loss spectroscopy) and in ensemble (by UV-Vis and ATR-FTIR spectroscopies). Computational results on complex anisotropic gold nanostructures are validated experimentally on samples prepared colloidally, underscoring their importance as ideal testbeds for the study of structure-property relationships in colloidal nanostructures of high structural complexity.


2020 ◽  
Vol 10 (3) ◽  
pp. 169-184
Author(s):  
Rachna Anand ◽  
Arun Kumar ◽  
Arun Nanda

Background: Solubility and dissolution profile are the major factors that directly affect the biological activity of a drug, and these factors are governed by the physicochemical properties of the drug. Crystal engineering is a newer and promising approach to improve the physicochemical characteristics of a drug without any change in its pharmacological action, through the selection of a wide range of easily available crystal formers. Objective: The goal of this review is to summarize the importance of crystal engineering in improving the physicochemical properties of a drug; the methods of design, development and application of co-crystals; and future trends in pharmaceutical co-crystal research. Co-crystallization can also be carried out for molecules that lack ionizable functional groups, unlike salt formation, which requires them. Conclusion: Co-crystals are an interesting and promising research area for pharmaceutical scientists seeking to fine-tune the physicochemical properties of drug materials. Co-crystallization can be a tool to extend the life cycle of an older drug molecule. Crystal engineering is potentially more advantageous than other approaches used in the pharmaceutical industry, offering a plethora of biopharmaceutical and physicochemical enhancements to a drug molecule without the need for any pharmacological change.


2021 ◽  
Vol 22 (1) ◽  
Author(s):  
Eleanor F. Miller ◽  
Andrea Manica

Abstract Background Today an unprecedented amount of genetic sequence data is stored in publicly available repositories. For decades now, mitochondrial DNA (mtDNA) has been the workhorse of genetic studies, and as a result, there is a large volume of mtDNA data available in these repositories for a wide range of species. Indeed, whilst whole-genome sequencing is an exciting prospect for the future, for most non-model organisms, classical markers such as mtDNA remain widely used. By compiling existing data from multiple original studies, it is possible to build powerful new datasets capable of exploring many questions in ecology, evolution and conservation biology. One key question that these data can help inform is what happened in a species' demographic past. However, compiling data in this manner is not trivial: there are many complexities associated with data extraction, data quality and data handling. Results Here we present the mtDNAcombine package, a collection of tools developed to manage some of the major decisions associated with handling multi-study sequence data, with a particular focus on preparing sequence data for Bayesian skyline plot demographic reconstructions. Conclusions There is now more genetic information available than ever before, and large meta-datasets offer great opportunities to explore new and exciting avenues of research. However, compiling multi-study datasets remains a technically challenging prospect. The mtDNAcombine package provides a pipeline to streamline the process of downloading, curating, and analysing sequence data, guiding the process of compiling data sets from the online database GenBank.


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Géraldine Fauville ◽  
Anna C. M. Queiroz ◽  
Erika S. Woolsey ◽  
Jonathan W. Kelly ◽  
Jeremy N. Bailenson

Abstract
Research about vection (illusory self-motion) has investigated a wide range of sensory cues and employed various methods and equipment, including use of virtual reality (VR). However, there is currently no research in the field of vection on the impact of floating in water while experiencing VR. Aquatic immersion presents a new and interesting method to potentially enhance vection by reducing the conflicting sensory information that is usually experienced when standing or sitting on a stable surface. This study compares vection, visually induced motion sickness, and presence among participants experiencing VR while standing on the ground or floating in water. Results show that vection was significantly enhanced for the participants in the Water condition, whose judgments of self-displacement were larger than those of participants in the Ground condition. No differences in visually induced motion sickness or presence were found between conditions. We discuss the implications of this new type of VR experience for the fields of VR and vection, as well as future research questions that emerge from our findings.


2021 ◽  
Vol 22 (1) ◽  
Author(s):  
Yance Feng ◽  
Lei M. Li

Abstract Background Normalization of RNA-seq data aims at identifying biological expression differentiation between samples by removing the effects of unwanted confounding factors. Explicitly or implicitly, the justification of normalization requires a set of housekeeping genes. However, the existence of housekeeping genes common to a very large collection of samples, especially under a wide range of conditions, is questionable. Results We propose to carry out pairwise normalization with respect to multiple references, selected from representative samples. The pairwise intermediates are then integrated based on a linear model that adjusts for the reference effects. Motivated by the notion of housekeeping genes and their statistical counterparts, we adopt robust least trimmed squares regression in the pairwise normalization. The proposed method (MUREN) is compared with other existing tools on standard data sets. Goodness of normalization is evaluated with emphasis on preserving possible asymmetric differentiation, whose biological significance is exemplified by single-cell data of the cell cycle. MUREN is implemented as an R package. The code, under license GPL-3, is available on the github platform: github.com/hippo-yf/MUREN and on the conda platform: anaconda.org/hippo-yf/r-muren. Conclusions MUREN performs RNA-seq normalization using a two-step statistical regression induced from a general principle. We propose that the densities of pairwise differentiations be used to evaluate the goodness of normalization. MUREN adjusts the mode of differentiation toward zero while preserving the skewness due to biological asymmetric differentiation. Moreover, by robustly integrating pre-normalized counts with respect to multiple references, MUREN is immune to individual outlier samples.
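The least-trimmed-squares idea can be illustrated with a toy sketch (a simple concentration-step loop with assumed parameter values, not the MUREN code): a log-scale offset between a sample and a reference is refit only on the genes that currently fit best, so truly differential genes act like outliers and do not bias the normalization, mimicking an implicit housekeeping set.

```python
import numpy as np

def lts_scale(ref, sample, keep=0.5, iters=20):
    """Least-trimmed-squares estimate of the log2 scale offset between a
    sample and a reference, fit on the `keep` fraction of best-fitting genes."""
    r = np.log2(ref + 1.0)
    s = np.log2(sample + 1.0)
    offset = np.median(s - r)                 # robust starting value
    h = int(keep * len(r))
    for _ in range(iters):
        resid = np.abs(s - r - offset)
        subset = np.argsort(resid)[:h]        # genes that currently fit best
        offset = np.mean(s[subset] - r[subset])
    return offset

rng = np.random.default_rng(2)
ref = rng.poisson(100, size=1000).astype(float)
sample = ref * 4.0            # sequencing depth differs by log2 offset ~2
sample[:100] *= 8.0           # 10% truly differential genes (outliers)
offset = lts_scale(ref, sample)
```

Despite 10% strongly differential genes, the trimmed fit recovers an offset close to 2, where an untrimmed mean would be pulled upward.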


Sensors ◽  
2021 ◽  
Vol 21 (10) ◽  
pp. 3406
Author(s):  
Jie Jiang ◽  
Yin Zou ◽  
Lidong Chen ◽  
Yujie Fang

Precise localization and pose estimation in indoor environments are commonly employed in a wide range of applications, including robotics, augmented reality, and navigation and positioning services. Such applications can be solved via visual localization using a pre-built 3D model. The increase in search space associated with large scenes can be overcome by retrieving images in advance and subsequently estimating the pose. The majority of current deep learning-based image retrieval methods require labeled data, which increases annotation costs and complicates data acquisition. In this paper, we propose an unsupervised hierarchical indoor localization framework that integrates an unsupervised variational autoencoder (VAE) network with a visual Structure-from-Motion (SfM) approach in order to extract global and local features. During localization, global features are used for image retrieval at the scene-map level to obtain candidate images, and local features are subsequently used to estimate the pose from 2D-3D matches between the query and candidate images. Only RGB images are used as input to the proposed localization system, which is both convenient and challenging. Experimental results reveal that the proposed method localizes images within 0.16 m and 4° on the 7-Scenes data sets and 32.8% of images within 5 m and 20° on the Baidu data set. Furthermore, the proposed method achieves higher precision than other advanced methods.
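The coarse stage of such a hierarchical pipeline prunes the search space before any 2D-3D matching: database images are ranked by similarity of their global descriptors and only the top candidates go on to pose estimation. A minimal sketch (random vectors stand in for the VAE latent codes; the function name is hypothetical, not the authors' code):

```python
import numpy as np

def retrieve(query, db, k=3):
    """Rank database images by cosine similarity of global descriptors
    and return the indices of the top-k candidates."""
    q = query / np.linalg.norm(query)
    d = db / np.linalg.norm(db, axis=1, keepdims=True)
    return np.argsort(-(d @ q))[:k]           # highest similarity first

rng = np.random.default_rng(3)
db = rng.normal(size=(100, 128))              # descriptors of mapped images
query = db[42] + rng.normal(0, 0.05, 128)     # query taken near image 42
print(retrieve(query, db)[0])  # -> 42
```

In the full system, the retained candidates would then supply 2D-3D correspondences for a PnP-style pose solver, so the expensive matching runs on k images instead of the whole map.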

