A new Bayesian methodology for nonlinear model calibration in Computational Systems Biology

2019 ◽  
Author(s):  
Fortunato Bianconi ◽  
Lorenzo Tomassoni ◽  
Chiara Antonini ◽  
Paolo Valigi

Abstract: Computational modeling is a common tool for quantitatively describing biological processes. However, most model parameters are usually unknown because they cannot be directly measured. A key issue in Systems Biology is therefore model calibration, i.e., estimating parameters from experimental data. Existing methodologies for parameter estimation fall into two classes: frequentist and Bayesian methods. The former optimize a cost function, while the latter estimate the posterior distribution of the parameters through various sampling techniques. Here, we present an innovative Bayesian method, called Conditional Robust Calibration (CRC), for nonlinear model calibration and robustness analysis using omics data. CRC is an iterative algorithm based on the sampling of a proposal distribution and on the definition of multiple objective functions, one for each observable. CRC estimates the probability density function (pdf) of the parameters conditioned on the experimental measurements, and it performs a robustness analysis quantifying how much each parameter influences the observables' behavior. We apply CRC to three Ordinary Differential Equation (ODE) models to test its performance against other state-of-the-art approaches, namely Profile Likelihood (PL), Approximate Bayesian Computation Sequential Monte Carlo (ABC-SMC) and Delayed Rejection Adaptive Metropolis (DRAM). Compared with these methods, CRC finds a robust solution at a reduced computational cost. CRC is developed as a set of Matlab functions (version R2018), whose fundamental source code is freely available at https://github.com/fortunatobianconi/CRC.
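As a rough illustration of the per-observable acceptance idea behind CRC (not the authors' Matlab implementation), the sketch below samples a proposal distribution for a hypothetical two-observable model, scores each observable with its own objective function, and keeps only the parameter vectors that score well on every objective, yielding an empirical pdf conditioned on the data. The model, bounds, sample size and thresholds are all illustrative assumptions.

```python
import numpy as np

def toy_model(theta, t):
    # Hypothetical two-observable model: an exponential decay and its integral.
    k, a = theta
    y1 = a * np.exp(-k * t)
    y2 = (a / k) * (1.0 - np.exp(-k * t))
    return y1, y2

def conditional_calibration(data, t, n_samples=5000, keep_frac=0.05, seed=0):
    """One iteration of a CRC-like scheme: sample a proposal, score each
    observable separately, keep samples that are good on *every* objective."""
    rng = np.random.default_rng(seed)
    # Proposal over (k, a); the bounds are illustrative only.
    thetas = rng.uniform([0.1, 0.5], [2.0, 3.0], size=(n_samples, 2))
    d1 = np.empty(n_samples)
    d2 = np.empty(n_samples)
    for i, th in enumerate(thetas):
        y1, y2 = toy_model(th, t)
        d1[i] = np.linalg.norm(y1 - data[0])   # objective for observable 1
        d2[i] = np.linalg.norm(y2 - data[1])   # objective for observable 2
    # Per-observable acceptance thresholds at the keep_frac quantile.
    eps1, eps2 = np.quantile(d1, keep_frac), np.quantile(d2, keep_frac)
    accepted = thetas[(d1 <= eps1) & (d2 <= eps2)]
    return accepted  # empirical parameter pdf conditioned on the data

t = np.linspace(0.0, 5.0, 20)
true_theta = (0.8, 2.0)
data = toy_model(true_theta, t)
post = conditional_calibration(data, t)
print(post.mean(axis=0))
```

In a real application the toy simulator would be replaced by an ODE solve, and the quantile thresholds would be tightened over iterations.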

2017 ◽  
Author(s):  
Fortunato Bianconi ◽  
Chiara Antonini ◽  
Lorenzo Tomassoni ◽  
Paolo Valigi

Abstract: Computational modeling is a remarkable and common tool for quantitatively describing a biological process. However, most model parameters, such as kinetic parameters, initial conditions and scale factors, are usually unknown because they cannot be directly measured. Key issues in Systems Biology are therefore model calibration and identifiability analysis, i.e., estimating parameters from experimental data and assessing how well those parameters are determined by the dimension and quality of the data. In the Systems Biology and Computational Biology communities, the existing methodologies for parameter estimation are currently divided into two classes: frequentist methods and Bayesian methods. The former are based on the optimization of a cost function, while the latter estimate the posterior distribution of the model parameters through various sampling techniques. In this work, we present an innovative Bayesian method, called Conditional Robust Calibration (CRC), for model calibration and identifiability analysis. The algorithm is an iterative procedure based on parameter space sampling and on the definition of multiple objective functions, one for each output variable. The method estimates step by step the probability density function (pdf) of the parameters conditioned on the experimental measurements, and returns as output the subset of the parameter space that best reproduces the dataset. We apply CRC to six Ordinary Differential Equation (ODE) models of different characteristics and complexity to test its performance against profile likelihood (PL) and Approximate Bayesian Computation Sequential Monte Carlo (ABC-SMC) approaches. The datasets selected for calibration are time-course measurements of different natures: noisy or noiseless, real or in silico. Compared with PL, our approach finds a more robust solution because parameter identifiability is inferred from the conditional pdfs of the estimated parameters. Compared with ABC-SMC, it finds a more precise solution at a reduced computational cost.
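For contrast with the ABC-SMC baseline mentioned above, here is a minimal sketch of a sequential scheme in which a particle population is perturbed and re-filtered under a decreasing tolerance schedule. The one-parameter model, perturbation kernel and schedule are hypothetical, and the importance weights of a full ABC-SMC are omitted for brevity.

```python
import numpy as np

def simulate(theta, t):
    # Hypothetical one-parameter decay model standing in for an ODE simulator.
    return theta * np.exp(-t)

def abc_smc(data, t, eps_schedule=(2.0, 1.0, 0.5), n_pop=300, seed=1):
    """Minimal ABC-SMC: each generation, particles are perturbed and accepted
    only if the simulation-data distance is below a shrinking tolerance."""
    rng = np.random.default_rng(seed)
    particles = rng.uniform(0.0, 5.0, n_pop)        # generation 0: prior draws
    for eps in eps_schedule:
        new = []
        while len(new) < n_pop:
            th = rng.choice(particles) + rng.normal(0.0, 0.2)  # perturb
            if th <= 0.0:
                continue
            if np.linalg.norm(simulate(th, t) - data) <= eps:
                new.append(th)
        particles = np.array(new)
    return particles

t = np.linspace(0.0, 3.0, 10)
data = simulate(2.0, t)
pop = abc_smc(data, t)
print(pop.mean(), pop.std())
```

The final population concentrates around the true parameter value 2.0; tightening the last tolerance further would sharpen it at increasing simulation cost, which is the trade-off the comparison above refers to.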


2017 ◽  
Vol 12 (4) ◽  
Author(s):  
Yousheng Chen ◽  
Andreas Linderholt ◽  
Thomas J. S. Abrahamsson

Correlation and calibration using test data are natural ingredients in the process of validating computational models. Model calibration for the important subclass of nonlinear systems which consists of structures dominated by linear behavior with the presence of local nonlinear effects is studied in this work. The experimental validation of a nonlinear model calibration method is conducted using a replica of the École Centrale de Lyon (ECL) nonlinear benchmark test setup. The calibration method is based on the selection of uncertain model parameters and the data that form the calibration metric together with an efficient optimization routine. The parameterization is chosen so that the expected covariances of the parameter estimates are made small. To obtain informative data, the excitation force is designed to be multisinusoidal and the resulting steady-state multiharmonic frequency response data are measured. To shorten the optimization time, plausible starting seed candidates are selected using the Latin hypercube sampling method. The candidate parameter set giving the smallest deviation to the test data is used as a starting point for an iterative search for a calibration solution. The model calibration is conducted by minimizing the deviations between the measured steady-state multiharmonic frequency response data and the analytical counterparts that are calculated using the multiharmonic balance method. The resulting calibrated model's output corresponds well with the measured responses.
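The seed-selection step described above can be sketched as follows, assuming a hypothetical single-degree-of-freedom frequency-response misfit in place of the paper's multiharmonic metric: a Latin hypercube sample covers the parameter box, and the candidate with the smallest deviation from the "measured" data would seed the iterative search.

```python
import numpy as np

def latin_hypercube(n, bounds, seed=0):
    """Latin hypercube sample: one stratum per point along every dimension."""
    rng = np.random.default_rng(seed)
    d = len(bounds)
    strata = np.column_stack([rng.permutation(n) for _ in range(d)])
    pts = (strata + rng.uniform(size=(n, d))) / n    # stratified in [0, 1]^d
    lo, hi = np.array(bounds, dtype=float).T
    return lo + pts * (hi - lo)

def deviation(theta, freqs, measured):
    # Hypothetical calibration metric: misfit of a one-DOF frequency-response
    # magnitude 1/sqrt((k - w^2)^2 + (c*w)^2) against the measured data.
    k, c = theta
    model = 1.0 / np.sqrt((k - freqs**2) ** 2 + (c * freqs) ** 2)
    return np.linalg.norm(model - measured)

freqs = np.linspace(0.5, 3.0, 30)
# Synthetic "measured" response generated with k = 4.0, c = 0.3.
measured = 1.0 / np.sqrt((4.0 - freqs**2) ** 2 + (0.3 * freqs) ** 2)

# Rank the LHS candidates; the best one seeds the local calibration search.
seeds = latin_hypercube(50, bounds=[(1.0, 9.0), (0.05, 1.0)])
best = min(seeds, key=lambda th: deviation(th, freqs, measured))
print(best)
```

In the paper's setting the deviation would instead compare measured and multiharmonic-balance-computed steady-state responses, but the screening logic is the same.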


2019 ◽  
Author(s):  
Jan Mikelson ◽  
Mustafa Khammash

The development of mechanistic models of biological systems is a central part of Systems Biology. One major challenge in developing these models is the accurate inference of the model parameters. In recent years, nested sampling methods have gained increasing attention in the Systems Biology community. Among the attractive features of these methods are that they are easily parallelizable and that they provide an estimate of the variance of the final Bayesian evidence estimate from a single run. Still, the applicability of these methods is limited, as they require the likelihood to be available and thus cannot be applied to stochastic systems with intractable likelihoods. In this paper, we present a likelihood-free nested sampling formulation that gives an unbiased estimator of the Bayesian evidence as well as samples from the posterior. Unlike most common nested sampling schemes, we propose to use the information about the samples from the final prior volume to aid in the approximation of the Bayesian evidence, and we show how this allows us to formulate a lower bound on the variance of the obtained estimator. We then use this lower bound to formulate a novel termination criterion for nested sampling approaches. We illustrate how our approach is applied to several realistically sized models with simulated data as well as recently published biological data. The presented method provides a viable alternative to other likelihood-free inference schemes such as Sequential Monte Carlo or Approximate Bayesian Computation methods. We also provide an intuitive and performant C++ implementation of our method.
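As a minimal sketch of the evidence computation that nested sampling performs (using a tractable stand-in likelihood; in the authors' likelihood-free setting the acceptance step would instead be simulation-based), the loop below shrinks the prior volume by roughly 1/N per step and accumulates Z = Σ L_i w_i, with a final contribution from the remaining live points in the last prior volume, which is the quantity the abstract's variance bound concerns. All numbers here are illustrative.

```python
import numpy as np

def loglike(theta, data):
    # Tractable stand-in log-likelihood: unit-variance Gaussian around data.
    return -0.5 * np.sum((theta - data) ** 2)

def nested_sampling(data, n_live=100, n_iter=400, seed=0):
    """Minimal nested sampling over a uniform prior on [-5, 5]^d: replace the
    worst live point with a constrained prior draw, shrinking volume each step."""
    rng = np.random.default_rng(seed)
    live = rng.uniform(-5.0, 5.0, size=(n_live, len(data)))
    ll = np.array([loglike(p, data) for p in live])
    log_z, log_x = -np.inf, 0.0
    for _ in range(n_iter):
        worst = np.argmin(ll)
        log_x_new = log_x - 1.0 / n_live             # expected volume shrinkage
        log_w = np.log(np.exp(log_x) - np.exp(log_x_new))
        log_z = np.logaddexp(log_z, ll[worst] + log_w)
        # Replace the worst point by rejection sampling above its likelihood.
        while True:
            cand = rng.uniform(-5.0, 5.0, size=len(data))
            if loglike(cand, data) > ll[worst]:
                break
        live[worst], ll[worst] = cand, loglike(cand, data)
        log_x = log_x_new
    # Final correction: the remaining live points fill the last prior volume.
    log_z = np.logaddexp(log_z, np.log(np.mean(np.exp(ll))) + log_x)
    return log_z

data = np.array([0.5, -0.5])
log_evidence = nested_sampling(data)
print(log_evidence)
```

For this prior and likelihood the true evidence is approximately 2π/100, i.e. log Z ≈ -2.77, which the estimator recovers up to the sampling error that the single-run variance estimate quantifies.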


2020 ◽  
Vol 16 (12) ◽  
pp. e1008495
Author(s):  
Ivan Borisov ◽  
Evgeny Metelkin

Practical identifiability of Systems Biology models has received a lot of attention in recent scientific research. It addresses a question crucial to a model's predictability: how accurately the model parameters can be recovered from the available experimental data. Methods based on profile likelihood are among the most reliable methods of practical identification. However, these methods are often computationally demanding or lead to inaccurate estimates of the parameters' confidence intervals. Developing methods that can accurately produce parameter confidence intervals in reasonable computational time is of utmost importance for Systems Biology and QSP modeling. We propose an algorithm, Confidence Intervals by Constraint Optimization (CICO), based on profile likelihood and designed to speed up confidence interval estimation and reduce computational cost. The numerical implementation of the algorithm includes settings to control the accuracy of the confidence interval estimates. The algorithm was tested on a number of Systems Biology models, including the Taxol treatment model and the STAT5 dimerization model discussed in the current article. The CICO algorithm is implemented in a software package freely available in Julia (https://github.com/insysbio/LikelihoodProfiler.jl) and Python (https://github.com/insysbio/LikelihoodProfiler.py).
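CICO itself recasts the endpoint search as a constrained optimization problem; as a simpler sketch of what those endpoints are, the code below finds where a hypothetical one-dimensional negative log-likelihood crosses the usual chi-square threshold by bracketing and bisection. The quadratic nll is an assumption for illustration, not a model from the article.

```python
def nll(theta):
    # Hypothetical 1-D negative log-likelihood: minimum at 1.5, sigma = 0.2.
    return (theta - 1.5) ** 2 / (2 * 0.2 ** 2)

def ci_endpoint(nll, theta_hat, direction, delta=1.92, step=1.0, tol=1e-8):
    """Find where nll crosses nll(theta_hat) + delta on one side.
    delta = 1.92 is chi2(1, 0.95)/2, the standard profile-likelihood cutoff."""
    target = nll(theta_hat) + delta
    lo, hi = theta_hat, theta_hat + direction * step
    while nll(hi) < target:            # expand until the crossing is bracketed
        hi += direction * step
    for _ in range(200):               # bisect down to the tolerance
        mid = 0.5 * (lo + hi)
        if nll(mid) < target:
            lo = mid
        else:
            hi = mid
        if abs(hi - lo) < tol:
            break
    return 0.5 * (lo + hi)

theta_hat = 1.5
lower = ci_endpoint(nll, theta_hat, direction=-1)
upper = ci_endpoint(nll, theta_hat, direction=+1)
print(lower, upper)   # 95% profile-likelihood confidence interval endpoints
```

For this quadratic nll the endpoints are theta_hat ± sigma·sqrt(2·1.92); for a multi-parameter model, nll would be the profile (re-optimized over the remaining parameters at each point), which is exactly the expensive step CICO is designed to shortcut.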


2017 ◽  
Vol 14 (134) ◽  
pp. 20170340 ◽  
Author(s):  
Aidan C. Daly ◽  
Jonathan Cooper ◽  
David J. Gavaghan ◽  
Chris Holmes

Bayesian methods are advantageous for biological modelling studies due to their ability to quantify and characterize posterior variability in model parameters. When Bayesian methods cannot be applied, due either to non-determinism in the model or limitations on system observability, approximate Bayesian computation (ABC) methods can be used to similar effect, despite producing inflated estimates of the true posterior variance. Owing to generally differing application domains, there are few studies comparing Bayesian and ABC methods, and thus there is little understanding of the properties and magnitude of this uncertainty inflation. To address this problem, we present two popular strategies for ABC sampling that we have adapted to perform exact Bayesian inference, and compare them on several model problems. We find that one sampler was impractical for exact inference due to its sensitivity to a key normalizing constant, and additionally highlight sensitivities of both samplers to various algorithmic parameters and model conditions. We conclude with a study of the O'Hara–Rudy cardiac action potential model to quantify the uncertainty amplification resulting from employing ABC using a set of clinically relevant biomarkers. We hope that this work serves to guide the implementation and comparative assessment of Bayesian and ABC sampling techniques in biological models.
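The variance inflation discussed above can be made concrete with a toy Gaussian-mean example: under a flat prior the exact posterior standard deviation of the mean is σ/√n, while an ABC rejection sampler with tolerance ε returns a wider posterior that shrinks toward the exact one as ε → 0. The prior range and tolerances below are illustrative, and the per-sample simulation is shortcut by drawing the sample mean directly from its sampling distribution.

```python
import numpy as np

rng = np.random.default_rng(0)
true_mu, sigma, n = 0.0, 1.0, 50
data = rng.normal(true_mu, sigma, n)

# Exact posterior for mu under a flat prior: N(mean(data), sigma^2 / n).
exact_sd = sigma / np.sqrt(n)

def abc_rejection(data, eps, n_draws=20000, seed=1):
    """ABC: accept prior draws whose simulated sample mean falls within eps
    of the observed sample mean (a sufficient summary statistic here)."""
    rng = np.random.default_rng(seed)
    obs = data.mean()
    mus = rng.uniform(-3.0, 3.0, n_draws)                # truncated flat prior
    sims = rng.normal(mus, sigma / np.sqrt(len(data)))   # simulated sample means
    return mus[np.abs(sims - obs) <= eps]

loose = abc_rejection(data, eps=0.5)
tight = abc_rejection(data, eps=0.05)
print(exact_sd, loose.std(), tight.std())
```

The loose-tolerance posterior is visibly inflated relative to σ/√n (roughly by sqrt(1 + ε²/(3·exact_sd²)) for a uniform acceptance kernel), while the tight-tolerance posterior nearly matches it, at the cost of many more rejected simulations.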


Cancers ◽  
2020 ◽  
Vol 13 (1) ◽  
pp. 35
Author(s):  
Sahar Aghakhani ◽  
Naouel Zerrouk ◽  
Anna Niarakis

Fibroblasts, the most abundant cells in connective tissue, are key modulators of extracellular matrix (ECM) composition. These spindle-shaped cells are capable of synthesizing various extracellular matrix proteins and collagen. They also provide the structural framework (stroma) for tissues and play a pivotal role in the wound healing process. While they maintain ECM turnover and regulate several physiological processes, they can also undergo transformations in response to certain stimuli and display aggressive phenotypes that contribute to disease pathophysiology. In this review, we focus on the metabolic pathways of glucose and highlight metabolic reprogramming as a critical event that contributes to the transition of fibroblasts from quiescent to activated and aggressive cells. We also cover the emerging evidence that allows us to draw parallels between fibroblasts in autoimmune disorders, more specifically in rheumatoid arthritis, and in cancer. We link the metabolic changes of fibroblasts to the toxic environment created by the disease condition and discuss how targeting metabolic reprogramming could be employed in the treatment of such diseases. Lastly, we discuss Systems Biology approaches, and more specifically computational modeling, as a means to elucidate pathogenetic mechanisms and accelerate the identification of novel therapeutic targets.


Water ◽  
2021 ◽  
Vol 13 (11) ◽  
pp. 1484
Author(s):  
Dagmar Dlouhá ◽  
Viktor Dubovský ◽  
Lukáš Pospíšil

We present an approach for calibrating the parameters of simplified evaporation models by optimizing them against the most complex model for evaporation estimation, the Penman–Monteith equation. This model computes evaporation from several input quantities, such as air temperature, wind speed, heat storage and net radiation. However, these quantities are not always all available, and simplified models must then be used. Our interest in free-water-surface evaporation stems from the ongoing hydric reclamation of the former Ležáky–Most quarry, i.e., the restoration of the mined land to a natural and economically usable state. For the emerging pit lakes, the prediction of evaporation and of the water level plays a crucial role. We examine the methodology on several popular models and standard statistical measures. The presented approach can be applied in a general model calibration process against any theoretical or measured evaporation.
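As a sketch of the calibration idea (with a synthetic series standing in for Penman–Monteith outputs, since the paper's models and weather data are not reproduced here), a simplified temperature-only model E = aT + b can be fitted to the reference evaporation series by least squares:

```python
import numpy as np

# Synthetic reference series standing in for Penman-Monteith evaporation
# estimates; in practice it would be computed from full weather records.
temp = np.linspace(5.0, 30.0, 60)                    # air temperature [deg C]
reference = 0.12 * temp + 0.4 + np.random.default_rng(0).normal(0.0, 0.1, 60)

# Simplified model E = a*T + b, calibrated by ordinary least squares so that
# its output tracks the reference series as closely as possible.
A = np.column_stack([temp, np.ones_like(temp)])
(a, b), *_ = np.linalg.lstsq(A, reference, rcond=None)

# Goodness of fit of the calibrated simplified model against the reference.
rmse = np.sqrt(np.mean((A @ [a, b] - reference) ** 2))
print(a, b, rmse)
```

Real simplified evaporation models are nonlinear in their parameters, so the closed-form least-squares step would generally be replaced by a numerical optimizer, but the structure of the calibration (minimize the misfit to the reference model) is the same.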

