Modeling and enhanced sampling of molecular systems with smooth and nonlinear data-driven collective variables

2013 ◽  
Vol 139 (21) ◽  
pp. 214101 ◽  
Author(s):  
Behrooz Hashemian ◽  
Daniel Millán ◽  
Marino Arroyo

2020 ◽  
Vol 11 (8) ◽  
pp. 2998-3004 ◽  
Author(s):  
Luigi Bonati ◽  
Valerio Rizzi ◽  
Michele Parrinello

2016 ◽  
Vol 113 (11) ◽  
pp. 2839-2844 ◽  
Author(s):  
Pratyush Tiwary ◽  
B. J. Berne

In modern-day simulations of many-body systems, much of the computational complexity is shifted to the identification of slowly changing molecular order parameters called collective variables (CVs) or reaction coordinates. A vast array of enhanced-sampling methods is based on the identification and biasing of these low-dimensional order parameters, whose fluctuations are important in driving rare events of interest. Here, we describe a new algorithm for finding optimal low-dimensional CVs for use in enhanced-sampling biasing methods such as umbrella sampling and metadynamics, when limited prior static and dynamic information is known about the system and a much larger set of candidate CVs is specified. The algorithm estimates the best combination of these candidate CVs, as quantified by a maximum path entropy estimate of the spectral gap for the dynamics viewed as a function of that CV. The algorithm is called spectral gap optimization of order parameters (SGOOP). Through multiple practical examples, we show how this postprocessing procedure can lead to an optimized CV and an improvement of several orders of magnitude in the convergence of the free energy calculated through metadynamics, essentially giving the ability to extract useful information even from unsuccessful metadynamics runs.
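The spectral-gap criterion can be illustrated with a minimal sketch. Here a one-dimensional trial CV is binned into a histogram, and a nearest-neighbor rate matrix is built with the maximum-caliber form k(i → j) ∝ √(p_j/p_i); the toy distributions and bin counts are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def sorted_eigenvalues(p):
    """Eigenvalues of a nearest-neighbor rate matrix built from the
    stationary probabilities p of a binned trial CV, using the
    maximum-caliber form k(i -> j) ~ sqrt(p_j / p_i)."""
    n = len(p)
    K = np.zeros((n, n))
    for i in range(n - 1):
        K[i + 1, i] = np.sqrt(p[i + 1] / p[i])  # rate i -> i+1
        K[i, i + 1] = np.sqrt(p[i] / p[i + 1])  # rate i+1 -> i
    K -= np.diag(K.sum(axis=0))                 # columns sum to zero (generator)
    return np.sort(np.linalg.eigvals(K).real)[::-1]  # 0 = ev[0] > ev[1] >= ...

def spectral_gap(p, n_slow=1):
    """Gap between the slow (barrier-crossing) modes and the fast rest."""
    ev = sorted_eigenvalues(p)
    return ev[n_slow] - ev[n_slow + 1]

s = np.linspace(-1.0, 1.0, 50)
p_good = np.exp(-30.0 * (s**2 - 0.5)**2)   # double well: this CV resolves the barrier
p_good /= p_good.sum()
p_poor = np.full(50, 1.0 / 50)             # featureless trial CV: no barrier visible

# The better CV shows the larger spectral gap, the quantity SGOOP maximizes.
print(spectral_gap(p_good), spectral_gap(p_poor))
```

A CV whose histogram resolves the metastable basins separates one slow eigenvalue from the fast intra-basin relaxation modes, which is exactly what the gap rewards.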


Author(s):  
Xiaoyong Cao ◽  
Pu Tian

Molecular modeling is widely utilized in subjects including, but not limited to, physics, chemistry, biology, materials science, and engineering, and impressive progress has been made in the development of theories, algorithms, and software packages. Dividing and conquering, and caching intermediate results, have been long-standing principles in algorithm development, and, not surprisingly, most of the important methodological advances in more than half a century of molecular modeling are implementations of these two fundamental principles. In mainstream classical computational molecular science based on force fields, tremendous effort has been invested in two lines of algorithm development. The first is coarse graining, which represents multiple basic particles of a higher-resolution model as a single larger and softer particle in a lower-resolution counterpart; the resulting force fields are partially transferable at the expense of some information loss. The second is enhanced sampling, which realizes "dividing and conquering" and/or "caching" in configurational space, with a focus either on reaction coordinates and collective variables, as in metadynamics and related algorithms, or on the transition matrix and state discretization, as in Markov state models. For this line of algorithms, spatial resolution is maintained but no transferability is available. Deep learning has been utilized to realize more efficient and accurate ways of "dividing and conquering" and "caching" along both of these lines of algorithmic research. We proposed and demonstrated the local free energy landscape approach, a new framework for classical computational molecular science and a third class of algorithm that facilitates molecular modeling through partially transferable, in-resolution "caching" of distributions for local clusters of molecular degrees of freedom.
Differences, connections, and potential interactions among these three algorithmic directions are discussed, in the hope of stimulating the development of more elegant, efficient, and reliable formulations and algorithms for "dividing and conquering" and "caching" in complex molecular systems.
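The coarse-graining mapping described above, i.e. several fine-grained particles collapsed into one bead, can be sketched in a few lines. This is a hypothetical center-of-mass mapping chosen for illustration; real coarse-grained schemes must also derive effective interactions between the beads:

```python
import numpy as np

def coarse_grain(positions, masses, groups):
    """Map fine-grained particles onto CG beads, placing each bead at
    the center of mass of one group of particle indices."""
    beads = []
    for idx in groups:
        m = masses[idx]                          # masses within this group
        beads.append(m @ positions[idx] / m.sum())
    return np.array(beads)

# Four particles mapped onto two beads (e.g. two diatomic "molecules")
pos = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0],
                [4.0, 0.0, 0.0], [6.0, 0.0, 0.0]])
mass = np.array([1.0, 1.0, 1.0, 3.0])
beads = coarse_grain(pos, mass, [np.array([0, 1]), np.array([2, 3])])
print(beads)  # bead 1 at x = 0.5, bead 2 at x = (4 + 3*6)/4 = 5.5
```

The mapping loses the internal degrees of freedom of each group, which is the "information loss" traded for softer, partially transferable interactions.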


2019 ◽  
Vol 116 (36) ◽  
pp. 17641-17647 ◽  
Author(s):  
Luigi Bonati ◽  
Yue-Yu Zhang ◽  
Michele Parrinello

Sampling complex free-energy surfaces is one of the main challenges of modern atomistic simulation methods. The presence of kinetic bottlenecks in such surfaces often renders a direct approach useless. A popular strategy is to identify a small number of key collective variables and to introduce a bias potential that favors their fluctuations in order to accelerate sampling. Here, we propose to use machine-learning techniques in conjunction with the recent variationally enhanced sampling method [O. Valsson, M. Parrinello, Phys. Rev. Lett. 113, 090601 (2014)] to determine such a potential. This is achieved by expressing the bias as a neural network whose parameters are determined in a variational learning scheme aimed at minimizing an appropriate functional; this required the development of a more efficient minimization technique. The expressivity of neural networks makes it possible to represent rapidly varying free-energy surfaces, removes boundary-effect artifacts, and allows several collective variables to be handled.
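On a toy one-dimensional surface, the idea of expressing the bias as a neural network and learning it variationally can be sketched as follows. The functional minimized is the variationally-enhanced-sampling functional Ω[V] with a uniform target distribution, but everything else (the grid-based evaluation, the tiny hand-rolled one-hidden-layer network, the plain gradient descent, β = 1) is a simplifying assumption, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
s = np.linspace(-1.5, 1.5, 61)            # grid over the collective variable
F = 5.0 * (s**2 - 1.0)**2                 # toy free-energy surface, in units of kT
p_tg = np.full_like(s, 1.0 / s.size)      # uniform target distribution

# Bias represented by a one-hidden-layer network: V(s) = w2 . tanh(w1*s + b1)
H = 8
w1 = rng.normal(size=H) * 2.0
b1 = rng.normal(size=H)
w2 = rng.normal(size=H) * 0.1

def bias(s):
    h = np.tanh(np.outer(s, w1) + b1)     # (n, H) hidden activations
    return h @ w2, h

def boltzmann(E):
    p = np.exp(-(E - E.min()))
    return p / p.sum()

# Gradient descent on Omega[V] = log sum_i exp(-(F_i + V_i)) + sum_i p_tg_i V_i,
# whose gradient is  dOmega/dtheta = sum_i (p_tg_i - p_V,i) dV_i/dtheta.
lr = 0.05
for _ in range(3000):
    V, h = bias(s)
    d = p_tg - boltzmann(F + V)           # per-grid-point weight
    sech2 = 1.0 - h**2                    # tanh' for backpropagation
    g2 = d @ h
    g1 = w2 * ((d * s) @ sech2)
    gb = w2 * (d @ sech2)
    w2 -= lr * g2
    w1 -= lr * g1
    b1 -= lr * gb

V, _ = bias(s)
p_biased = boltzmann(F + V)               # distribution sampled under the learned bias
p_unbiased = boltzmann(F)
print(p_biased.max() / p_biased.min())    # far flatter than the unbiased ~e^7.8 ratio
```

At the minimum of Ω the biased distribution matches the target, so V converges toward −F up to a constant, flattening the landscape the sampler sees.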


2021 ◽  
Vol 118 (44) ◽  
pp. e2113533118 ◽  
Author(s):  
Luigi Bonati ◽  
GiovanniMaria Piccini ◽  
Michele Parrinello

The development of enhanced sampling methods has greatly extended the scope of atomistic simulations, allowing long-time phenomena to be studied with accessible computational resources. Many such methods rely on the identification of an appropriate set of collective variables. These are meant to describe the system’s modes that most slowly approach equilibrium under the action of the sampling algorithm. Once identified, the equilibration of these modes is accelerated by the enhanced sampling method of choice. An attractive way of determining the collective variables is to relate them to the eigenfunctions and eigenvalues of the transfer operator. Unfortunately, this requires knowing the long-term dynamics of the system beforehand, which is generally not available. However, we have recently shown that it is indeed possible to determine efficient collective variables starting from biased simulations. In this paper, we bring the power of machine learning and the efficiency of the recently developed on-the-fly probability enhanced sampling (OPES) method to bear on this approach. The result is a powerful and robust algorithm that, given an initial enhanced sampling simulation performed with trial collective variables or generalized ensembles, extracts transfer operator eigenfunctions using a neural network ansatz and then accelerates them to promote sampling of rare events. To illustrate the generality of this approach, we apply it to several systems, ranging from the conformational transition of a small molecule to the folding of a miniprotein and the study of materials crystallization.
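The transfer-operator step can be illustrated with its simplest linear ancestor, time-lagged independent component analysis (TICA), of which the paper's neural-network ansatz is a nonlinear generalization: estimate time-lagged covariances from a trajectory and solve a generalized eigenvalue problem. The two-dimensional synthetic trajectory below (one slow relaxing coordinate, one fast noise coordinate) is purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
n, lag = 20000, 10

# Synthetic trajectory: a slow Ornstein-Uhlenbeck coordinate plus fast noise
x = np.empty(n)
x[0] = 0.0
for t in range(n - 1):                        # correlation time ~50 steps
    x[t + 1] = 0.98 * x[t] + np.sqrt(1 - 0.98**2) * rng.normal()
X = np.column_stack([x, rng.normal(size=n)])  # second column: white noise
X -= X.mean(axis=0)

# Time-lagged covariances and the generalized eigenproblem  C_tau v = lam C_0 v
C0 = X.T @ X / n
Ct = X[:-lag].T @ X[lag:] / (n - lag)
Ct = 0.5 * (Ct + Ct.T)                        # symmetrize (reversible dynamics)
lam, V = np.linalg.eig(np.linalg.solve(C0, Ct))
order = np.argsort(lam.real)[::-1]
lam, V = lam.real[order], V.real[:, order]

# The leading eigenvector loads on the slow coordinate; its eigenvalue
# approximates the true lag-10 autocorrelation 0.98**10 ~ 0.82.
v1 = V[:, 0] / np.linalg.norm(V[:, 0])
print(lam[0], v1)
```

The leading eigenfunctions found this way (or by a neural network in the nonlinear case) are exactly the slow modes worth accelerating with the chosen enhanced-sampling method.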


2018 ◽  
Author(s):  
Z. Faidon Brotzakis ◽  
Michele Parrinello

Protein conformational transitions often involve many slow degrees of freedom. Knowledge of these degrees of freedom offers distinct advantages, since it provides chemical and mechanistic insight and accelerates the convergence of enhanced sampling techniques that rely on collective variables. In this study, we applied a recently developed combination of the variational approach to conformational dynamics and metadynamics to the conformational transition of a moderately sized protein, L99A T4 lysozyme. To find the slow modes of the system, we combined data from NMR experiments and short MD simulations. A metadynamics simulation based on this information reveals the presence of two intermediate states at an affordable computational cost.


2018 ◽  
Vol 149 (7) ◽  
pp. 072001 ◽  
Author(s):  
Alessandro Laio ◽  
Athanassios Z. Panagiotopoulos ◽  
Daniel M. Zuckerman
