Estimating kinetic mechanisms with prior knowledge I: Linear parameter constraints

2018 ◽  
Vol 150 (2) ◽  
pp. 323-338 ◽  
Author(s):  
Autoosa Salari ◽  
Marco A. Navarro ◽  
Mirela Milescu ◽  
Lorin S. Milescu

To understand how ion channels and other proteins function at the molecular and cellular levels, one must decipher their kinetic mechanisms. Sophisticated algorithms have been developed that can extract kinetic parameters from a variety of experimental data types. However, formulating models that not only explain new data but are also consistent with existing knowledge remains a challenge. Here, we present a two-part study describing a mathematical and computational formalism for incorporating prior knowledge into the model through constraints. In this first part, we focus on constraints that enforce explicit linear relationships involving rate constants or other model parameters. We develop a simple, linear algebra–based transformation that can be applied to enforce many types of model properties and assumptions, such as microscopic reversibility, allosteric gating, and equality and inequality parameter relationships. This transformation converts the set of linearly interdependent model parameters into a reduced set of independent parameters, which can be passed to an automated search engine for model optimization. In the companion article, we introduce a complementary method that can be used to enforce arbitrary parameter relationships and any constraints that quantify the behavior of the model under certain conditions. The procedures described in this study can, in principle, be coupled to any of the existing methods for solving molecular kinetics for ion channels or other proteins. These concepts can be used not only to enforce existing knowledge but also to formulate and test new hypotheses.
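The linear-algebra transformation described in the abstract can be sketched as a null-space reparameterization: given linear constraints A·k = b on the parameter vector, any solution can be written as a particular solution plus a combination of null-space basis vectors, leaving only independent free parameters for the optimizer. This is a minimal illustrative sketch (the constraint matrix and toy relation are assumptions, not the paper's example):

```python
import numpy as np

def reparameterize(A, b):
    """Return (k0, N): a particular solution of A k = b and a null-space basis.

    Any k = k0 + N @ u satisfies the constraints for arbitrary free vector u,
    so u can be handed to an unconstrained search engine.
    """
    k0, *_ = np.linalg.lstsq(A, b, rcond=None)
    # SVD-based null space: right singular vectors beyond the rank of A
    _, s, Vt = np.linalg.svd(A)
    rank = int(np.sum(s > 1e-10))
    N = Vt[rank:].T
    return k0, N

# Toy constraint on 3 parameters: k1 + k2 - k3 = 0 (one linear relation,
# leaving 2 independent parameters)
A = np.array([[1.0, 1.0, -1.0]])
b = np.array([0.0])
k0, N = reparameterize(A, b)

u = np.array([0.3, -1.2])       # reduced, independent parameters
k = k0 + N @ u                  # full, constraint-satisfying parameter vector
assert np.allclose(A @ k, b)    # constraint holds for any choice of u
```

For rate constants, the same construction is typically applied to log-rates, since relations such as microscopic reversibility are linear in the logarithms.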

2018 ◽  
Vol 150 (2) ◽  
pp. 339-354 ◽  
Author(s):  
Marco A. Navarro ◽  
Autoosa Salari ◽  
Mirela Milescu ◽  
Lorin S. Milescu

Kinetic mechanisms predict how ion channels and other proteins function at the molecular and cellular levels. Ideally, a kinetic model should explain new data but also be consistent with existing knowledge. In this two-part study, we present a mathematical and computational formalism for incorporating prior knowledge into kinetic models through constraints. Here, we focus on constraints that quantify the behavior of the model under certain conditions, and on constraints that enforce arbitrary parameter relationships. The penalty-based optimization mechanism described here can be used to enforce virtually any model property or behavior, including those that cannot be easily expressed through mathematical relationships. Examples include maximum open probability, use-dependent availability, and nonlinear parameter relationships. We use a simple kinetic mechanism to test multiple sets of constraints that implement linear parameter relationships and arbitrary model properties and behaviors, and we provide numerical examples. This work complements and extends the companion article, where we show how to enforce explicit linear parameter relationships. By incorporating more knowledge into the parameter estimation procedure, it is possible to obtain more realistic and robust models with greater predictive power.
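The penalty-based idea can be sketched generically: any computable model property becomes a soft constraint by adding a weighted violation term to the fit cost. The model, data, and the 0.8 cap below are illustrative assumptions, not the paper's mechanism:

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic "observations" from a saturating dose-response-like curve
data_x = np.linspace(0.0, 1.0, 20)
data_y = 0.7 * data_x / (data_x + 0.3)

def model(params, x):
    pmax, khalf = params
    return pmax * x / (x + khalf)

def cost(params, weight=100.0):
    residuals = model(params, data_x) - data_y
    fit_term = np.sum(residuals ** 2)
    # Penalty: the predicted maximum open probability must not exceed 0.8.
    # Violations are squared and weighted, steering the optimizer back
    # into the allowed region without hard-coding the bound.
    violation = max(0.0, params[0] - 0.8)
    return fit_term + weight * violation ** 2

result = minimize(cost, x0=[0.5, 0.5], method="Nelder-Mead")
assert result.x[0] <= 0.8 + 1e-3   # constraint is (softly) enforced
```

The same pattern extends to behaviors with no closed-form expression, e.g. a penalty computed from a full simulated voltage-clamp protocol.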


2020 ◽  
Author(s):  
Guangyu Li ◽  
Chieh Wu ◽  
Dongqi Wang ◽  
Varun Srinivasan ◽  
David R. Kaeli ◽  
...  

Rapid progress in advanced analytical methods such as single-cell technologies enables an unprecedented and deeper understanding of microbial ecology beyond the resolution of conventional approaches. A major application challenge lies in determining a sufficient sample size without prior knowledge of the community complexity, while balancing statistical power against limited time and resources. This hinders the desired standardization and wider application of these technologies. Here, we proposed, tested and validated a computational sample-size assessment protocol built around a metric named kernel divergence. This metric has two advantages. First, it directly compares dataset-wise distributional differences, requiring no human intervention or prior-knowledge-based pre-classification. Second, it makes minimal assumptions about the distribution and sample space, broadening its application domain and enabling test-verified handling of datasets with both linear and non-linear relationships. The model was then validated in a case study with eight SCRS phenotyping datasets, each sampled from a different enhanced biological phosphorus removal (EBPR) activated sludge community located across North America. The model allows the determination of a sufficient sample size for any targeted or customized information-capture capacity or resolution level; for example, approximately 50 or 100 spectra suffice for full-scale EBPR-related ecosystems at 5% or 2% OPU cluster resolution, respectively. Given its flexibility and minimal restrictions on input data types, the proposed method is expected to become a standardized approach to sample-size optimization, enabling more comparable and reproducible experiments and analyses on complex environmental samples. Finally, these advantages suggest the method can generalize to other single-cell technologies or environmental applications, provided that the input datasets contain only continuous features.
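A dataset-wise distributional distance in the spirit of the kernel divergence above can be illustrated with a kernel two-sample statistic. This is a generic sketch only; the biased maximum mean discrepancy (MMD) with an RBF kernel shown here is an assumption, not the authors' implementation:

```python
import numpy as np

def rbf(X, Y, gamma):
    """RBF kernel matrix between sample sets X (n,d) and Y (m,d)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_divergence(X, Y, gamma=1.0):
    """Biased MMD^2: compares whole datasets with no pre-classification."""
    return (rbf(X, X, gamma).mean() + rbf(Y, Y, gamma).mean()
            - 2.0 * rbf(X, Y, gamma).mean())

rng = np.random.default_rng(0)
same = kernel_divergence(rng.normal(0, 1, (200, 5)),
                         rng.normal(0, 1, (200, 5)))
diff = kernel_divergence(rng.normal(0, 1, (200, 5)),
                         rng.normal(2, 1, (200, 5)))
assert diff > same   # a distributional shift yields a larger divergence
```

A sample-size assessment would then track how such a divergence between subsamples and the full dataset decays as the subsample grows.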


2019 ◽  
Vol 52 (1) ◽  
pp. 193-200 ◽  
Author(s):  
Andrew R. J. Nelson ◽  
Stuart W. Prescott

refnx is a model-based neutron and X-ray reflectometry data analysis package written in Python. It is cross-platform and has been tested on Linux, macOS and Windows. Its graphical user interface is browser-based, running through a Jupyter notebook. Model construction is modular, being composed from a series of components that each describe a subset of the interface, parameterized in terms of physically relevant parameters (volume fraction of a polymer, lipid area per molecule etc.). The model and data are used to create an objective, which is used to calculate the residuals, log-likelihood and log-prior probabilities of the system. Objectives are combined to perform co-refinement of multiple data sets and mixed-area models. Prior knowledge of parameter values is encoded as probability distribution functions or bounds on all parameters in the system. Additional prior probability terms can be defined for sets of components, over and above those available from the parameters alone. Algebraic parameter constraints are available. The software offers a choice of fitting approaches, including least-squares (global and gradient-based optimizers) and a Bayesian approach using a Markov-chain Monte Carlo algorithm to investigate the posterior distribution of the model parameters. The Bayesian approach is useful for examining parameter covariances, model selection and variability in the resulting scattering length density profiles. The package is designed to facilitate reproducible research; its use in Jupyter notebooks, and subsequent distribution of those notebooks as supporting information, permits straightforward reproduction of analyses.
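The "objective" pattern described above (residuals, log-likelihood and log-prior combined into one quantity) can be sketched generically. This is a schematic illustration, not the refnx API; the class and the bounds-style prior are assumptions:

```python
import numpy as np

class Objective:
    """Combine model, data and prior into residuals / log-posterior,
    consumable by either a least-squares optimizer or an MCMC sampler."""

    def __init__(self, model, x, y, y_err, log_prior):
        self.model, self.x, self.y, self.y_err = model, x, y, y_err
        self.log_prior = log_prior

    def residuals(self, params):
        return (self.y - self.model(params, self.x)) / self.y_err

    def log_likelihood(self, params):
        r = self.residuals(params)
        return -0.5 * np.sum(r ** 2 + np.log(2 * np.pi * self.y_err ** 2))

    def log_posterior(self, params):
        lp = self.log_prior(params)
        if not np.isfinite(lp):
            return -np.inf          # outside the prior bounds
        return lp + self.log_likelihood(params)

# Bounds-style prior: slope constrained to [0, 2]
def prior(params):
    return 0.0 if 0.0 <= params[0] <= 2.0 else -np.inf

x = np.linspace(0, 1, 10)
obj = Objective(lambda p, xv: p[0] * xv, x, 1.2 * x, np.full(10, 0.1), prior)
assert obj.log_posterior([1.2]) > obj.log_posterior([0.5])
```

Co-refinement then amounts to summing the log-posteriors of several such objectives over a shared parameter set.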


2020 ◽  
Vol 500 (2) ◽  
pp. 2704-2710 ◽  
Author(s):  
Yun-Wei Yu ◽  
Yuan-Chuan Zou ◽  
Zi-Gao Dai ◽  
Wen-Fei Yu

The association of FRB 200428 with an X-ray burst (XRB) from the Galactic magnetar SGR 1935+2154 has important implications for the physical processes responsible for the fast radio burst (FRB) phenomenon. By assuming that the XRB emission is produced in the magnetosphere, we investigate the possibility that the FRB emission is produced by a shock-powered synchrotron maser (SM), which is phenomenologically described with a number of free parameters. The observational constraints on the model parameters indicate that the model can in principle be consistent with the FRB 200428 observations, if the ejecta launched by magnetar activities have appropriate ingredients and structures and the shock processes occur on the line of sight. Specifically, a complete burst ejecta should consist of an ultra-relativistic, highly collimated e± component and a sub-relativistic, widely spreading baryonic component. The internal shocks producing the FRB emission arise from a collision between the e± ejecta and the remnant of a previous baryonic ejecta in the same direction. The parameter constraints depend on the uncertain spectrum and efficiency of the SM emission. While the spectrum is tentatively described by a spectral index of −2, we estimate the emission efficiency to be around 10⁻⁴ by requiring that the synchrotron emission of the shocked material not be much brighter than the magnetospheric XRB emission.


2021 ◽  
Author(s):  
Florian Wellmann ◽  
Miguel de la Varga ◽  
Nilgün Güdük ◽  
Jan von Harten ◽  
Fabian Stamm ◽  
...  

Geological models, as 3-D representations of subsurface structures and property distributions, are used in many economic, scientific, and societal decision processes. These models are built on prior assumptions and imperfect information, and they often result from an integration of geological and geophysical data types with varying quality. These aspects result in uncertainties about the predicted subsurface structures and property distributions, which will affect the subsequent decision process.

We discuss approaches to evaluate uncertainties in geological models and to integrate geological and geophysical information in combined workflows. A first step is the consideration of uncertainties in prior model parameters on the basis of uncertainty propagation (forward uncertainty quantification). When applied to structural geological models with discrete classes, these methods result in a class probability for each point in space, often represented in tessellated grid cells. These results can then be visualized or forwarded to process simulations. Another option is to add risk functions for subsequent decision analyses. In recent work, these geological uncertainty fields have also been used as an input to subsequent geophysical inversions.

A logical extension to these existing approaches is the integration of geological forward operators into inverse frameworks, to enable a full flow of inference for a wider range of relevant parameters. We investigate here specifically the use of probabilistic machine learning tools in combination with geological and geophysical modeling. Challenges exist due to the hierarchical nature of the probabilistic models, but modern sampling strategies allow for efficient sampling in these complex settings. We showcase the application with examples combining geological modeling and geophysical potential field measurements in an integrated model for improved decision making.


2020 ◽  
Vol 497 (1) ◽  
pp. 263-278 ◽  
Author(s):  
Narayan Khadka ◽  
Bharat Ratra

Risaliti and Lusso have compiled X-ray and UV flux measurements of 1598 quasars (QSOs) in the redshift range 0.036 ≤ z ≤ 5.1003, part of which, z ∼ 2.4 − 5.1, is largely cosmologically unprobed. In this paper we use these QSO measurements, alone and in conjunction with baryon acoustic oscillation (BAO) and Hubble parameter [H(z)] measurements, to constrain cosmological parameters in six different cosmological models, each with two different Hubble constant priors. In most of these models, given the larger uncertainties, the QSO cosmological parameter constraints are mostly consistent with those from the BAO + H(z) data. A somewhat significant exception is the non-relativistic matter density parameter Ωm0, where QSO data favour Ωm0 ∼ 0.5 − 0.6 in most models. As a result, in joint analyses of QSO data with H(z) + BAO data the 1D Ωm0 distributions shift slightly towards larger values. A joint analysis of the QSO + BAO + H(z) data is consistent with the current standard model, spatially-flat ΛCDM, but mildly favours closed spatial hypersurfaces and dynamical dark energy. Since the higher Ωm0 values favoured by QSO data appear to be associated with the z ∼ 2 − 5 part of these data, and conflict somewhat with strong indications for Ωm0 ∼ 0.3 from most z < 2.5 data as well as from the cosmic microwave background anisotropy data at z ∼ 1100, in most models, the larger QSO data Ωm0 is possibly more indicative of an issue with the z ∼ 2 − 5 QSO data than of an inadequacy of the standard flat ΛCDM model.


2019 ◽  
Vol 485 (2) ◽  
pp. 2806-2824 ◽  
Author(s):  
Linda Blot ◽  
Martin Crocce ◽  
Emiliano Sefusatti ◽  
Martha Lippich ◽  
Ariel G Sánchez ◽  
...  

We study the accuracy of several approximate methods for gravitational dynamics in terms of halo power spectrum multipoles and their estimated covariance matrix. We propagate the differences in covariances into parameter constraints related to growth rate of structure, Alcock–Paczynski distortions, and biasing. We consider seven methods in three broad categories: algorithms that solve for halo density evolution deterministically using Lagrangian trajectories (ICE–COLA, pinocchio, and peakpatch), methods that rely on halo assignment schemes on to dark matter overdensities calibrated with a target N-body run (halogen, patchy), and two standard assumptions about the full density probability distribution function (Gaussian and lognormal). We benchmark their performance against a set of three hundred N-body simulations, running similar sets of approximate simulations with matched initial conditions, for each method. We find that most methods reproduce the monopole to within 5 per cent, while residuals for the quadrupole are sometimes larger and scale dependent. The variance of the multipoles is typically reproduced within 10 per cent. Overall, we find that covariances built from approximate simulations yield errors on model parameters within 10 per cent of those from the N-body-based covariance.
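The pipeline from a set of mock simulations to parameter errors can be sketched generically: estimate the sample covariance of the measured multipoles across mocks, debias its inverse (e.g. with the Hartlap factor), and propagate it into a Fisher-style parameter error. The mock data and single-parameter model derivative below are illustrative assumptions, not the paper's analysis:

```python
import numpy as np

rng = np.random.default_rng(1)
n_sims, n_bins = 300, 10                    # 300 mocks, 10 multipole bins
mocks = rng.normal(size=(n_sims, n_bins))   # stand-in for measured multipoles

cov = np.cov(mocks, rowvar=False)           # sample covariance across mocks
# Hartlap factor: corrects the bias of the inverted sample covariance
hartlap = (n_sims - n_bins - 2) / (n_sims - 1)
precision = hartlap * np.linalg.inv(cov)

# Fisher-style 1-sigma error on a single amplitude parameter,
# given the model derivative dmu/dp in each bin (assumed constant here)
dmu = np.ones(n_bins)
sigma_p = 1.0 / np.sqrt(dmu @ precision @ dmu)
assert sigma_p > 0
```

Comparing `sigma_p` between an N-body-based and an approximate-mock-based covariance is the kind of test quantified in the abstract.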


Biometrika ◽  
2020 ◽  
Vol 107 (3) ◽  
pp. 609-625 ◽  
Author(s):  
Grace Yoon ◽  
Raymond J Carroll ◽  
Irina Gaynanova

Canonical correlation analysis investigates linear relationships between two sets of variables, but it often works poorly on modern datasets because of high dimensionality and mixed data types such as continuous, binary and zero-inflated. To overcome these challenges, we propose a semiparametric approach to sparse canonical correlation analysis based on the Gaussian copula. The main result of this paper is a truncated latent Gaussian copula model for data with excess zeros, which allows us to derive a rank-based estimator of the latent correlation matrix for mixed variable types without estimation of marginal transformation functions. The resulting canonical correlation analysis method works well in high-dimensional settings, as demonstrated via numerical studies, and when applied to the analysis of association between gene expression and microRNA data from breast cancer patients.
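The rank-based idea can be illustrated for the simplest, continuous–continuous case: under a Gaussian copula, the latent correlation is recovered from Kendall's tau via r = sin(π/2 · τ), with no need to estimate the marginal transformations. The bridges the paper derives for binary and zero-inflated (truncated) variables are different closed forms and are not shown here:

```python
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(2)
latent_r = 0.6
# Latent bivariate normal with correlation 0.6
z = rng.multivariate_normal([0, 0], [[1, latent_r], [latent_r, 1]], size=2000)
# Unknown monotone marginal transformations (ranks are invariant to these)
x, y = np.exp(z[:, 0]), z[:, 1] ** 3

tau, _ = kendalltau(x, y)
r_hat = np.sin(np.pi / 2 * tau)       # bridge: tau -> latent correlation
assert abs(r_hat - latent_r) < 0.1    # rank-based estimate recovers r
```

Because τ depends only on ranks, `r_hat` is unchanged by any monotone distortion of the margins, which is what makes the estimator semiparametric.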


Sensors ◽  
2020 ◽  
Vol 20 (3) ◽  
pp. 907 ◽  
Author(s):  
Ricardo da Rosa ◽  
Marco Aurelio Wehrmeister ◽  
Thadeu Brito ◽  
José Luís Lima ◽  
Ana Isabel Pinheiro Nunes Pereira

The use of robots to map disaster-stricken environments can prevent rescuers from being harmed when exploring an unknown space. In addition, a map built by a multi-robot team can give rescuers prior knowledge for planning their actions. The present work proposes the use of multiple unmanned aerial vehicles (UAVs) to construct a topological map inspired by the way that bees build their hives. A UAV can map a honeycomb cell only if it is adjacent to an already-known one, and different metrics for choosing the next cell to explore were applied. As the UAVs scan honeycomb adjacencies, RGB-D and thermal sensors capture other data types, generating a 3D view of the space and images of areas where there may be fire spots, respectively. Simulations in different environments showed that the choice of metric and the number of UAVs influence the number of displacements performed in the environment, consequently affecting exploration time and energy use.
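The exploration rule described above (a cell may be mapped only if adjacent to a known one, with candidates ranked by a metric) is essentially frontier-based selection on a hex grid. The axial-coordinate neighbours and the nearest-cell metric below are illustrative assumptions, not the authors' implementation:

```python
# Hex-grid neighbours in axial coordinates (q, r)
HEX_NEIGHBOURS = [(1, 0), (-1, 0), (0, 1), (0, -1), (1, -1), (-1, 1)]

def frontier(known):
    """Cells adjacent to the known set but not yet mapped."""
    return {(q + dq, r + dr)
            for (q, r) in known
            for (dq, dr) in HEX_NEIGHBOURS} - known

def next_cell(known, uav_pos):
    # Example metric: pick the frontier cell closest to the UAV,
    # minimizing the number of displacements
    return min(frontier(known),
               key=lambda c: abs(c[0] - uav_pos[0]) + abs(c[1] - uav_pos[1]))

known = {(0, 0)}                 # start with one mapped honeycomb cell
for _ in range(5):
    known.add(next_cell(known, (0, 0)))
assert len(known) == 6           # five new cells mapped, all frontier-adjacent
```

Swapping the `key` function changes the exploration metric, which is the knob whose effect on displacements and energy use the simulations compare.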


2017 ◽  
Vol 112 (3) ◽  
pp. 243a
Author(s):  
Autoosa Salari ◽  
Zachary F. Elkins ◽  
Marco A. Navarro ◽  
Benton R. Berigan ◽  
Jenna L. Lin ◽  
...  
