Estimating Likelihood from Simulations in Bayesian Inference of Ecological Models with Historical Contingency

2018 ◽  
Author(s):  
Stéphane Dupas

Abstract Ecological patterns result from historical contingency and deterministic processes. Disentangling these processes to extract probabilistic models of ecological dynamics is of major importance for ecological forecasting. Due to the high dimensionality of historical contingency, it is usually difficult to sample history from observed patterns. In environmental population genetics, the number of possible genealogies linking genetic data and environmental data through demographic and niche models is almost infinite. In ecosystem dynamics time series, the patterns are determined as much by probabilistic model parameters as by the contingency of historical variables. Approximate Bayesian computation (ABC) uses simulations to approximate this inference process. The rationale is to simulate data, extract summary statistics, and retain in the posterior the parameter values that produced simulations with summary statistics close to the observed summary statistics. The major drawback of this approach is that the summary-statistic distance is not exhaustive with respect to the model likelihood and may bias the results. In the present work, I show that if we can simulate the historical contingency from observed data and a probabilistic model in a backward approach, we can use the simulations to estimate a pseudo-likelihood that can be used in Bayesian inference. I apply this approach to genealogy sampling in environmental demogenetics and to ecosystem dynamics modelling.
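As a concrete illustration of this rationale, below is a minimal sketch of ABC rejection sampling in Python. The Poisson model, flat prior, sample-mean summary statistic, and tolerance are all illustrative stand-ins, not the demogenetic models discussed in the paper.

```python
# Minimal sketch of ABC rejection sampling on a toy Poisson model,
# with the sample mean as summary statistic (illustrative choices only).
import numpy as np

rng = np.random.default_rng(0)

observed = rng.poisson(lam=4.0, size=50)   # stand-in for field data
s_obs = observed.mean()                    # observed summary statistic

def simulate(theta, n=50):
    """Simulate one data set under parameter theta, return its summary."""
    return rng.poisson(lam=theta, size=n).mean()

# Rejection step: keep prior draws whose simulated summary lands close
# to the observed one.
prior_draws = rng.uniform(0.0, 10.0, size=100_000)   # flat prior on theta
epsilon = 0.1
posterior = [t for t in prior_draws if abs(simulate(t) - s_obs) < epsilon]

print(f"accepted {len(posterior)} draws, "
      f"posterior mean ~ {np.mean(posterior):.2f}")
```

The tolerance epsilon controls the trade-off the abstract points to: a looser tolerance accepts more draws but moves the approximate posterior further from the true one.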

2017 ◽  
Vol 14 (134) ◽  
pp. 20170340 ◽  
Author(s):  
Aidan C. Daly ◽  
Jonathan Cooper ◽  
David J. Gavaghan ◽  
Chris Holmes

Bayesian methods are advantageous for biological modelling studies due to their ability to quantify and characterize posterior variability in model parameters. When Bayesian methods cannot be applied, due either to non-determinism in the model or limitations on system observability, approximate Bayesian computation (ABC) methods can be used to similar effect, despite producing inflated estimates of the true posterior variance. Owing to generally differing application domains, there are few studies comparing Bayesian and ABC methods, and thus there is little understanding of the properties and magnitude of this uncertainty inflation. To address this problem, we present two popular strategies for ABC sampling that we have adapted to perform exact Bayesian inference, and compare them on several model problems. We find that one sampler was impractical for exact inference due to its sensitivity to a key normalizing constant, and additionally highlight sensitivities of both samplers to various algorithmic parameters and model conditions. We conclude with a study of the O'Hara–Rudy cardiac action potential model to quantify the uncertainty amplification resulting from employing ABC using a set of clinically relevant biomarkers. We hope that this work serves to guide the implementation and comparative assessment of Bayesian and ABC sampling techniques in biological models.
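To make the inflation concrete, the following sketch contrasts the exact conjugate posterior for a Gaussian mean with an ABC rejection posterior on the same data. The toy model, the N(0, 10^2) prior, and the tolerance of 0.05 are illustrative choices, not the benchmark problems of the paper.

```python
# Compare the exact posterior sd with an ABC rejection posterior sd for a
# Gaussian mean with known unit variance (illustrative toy problem).
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(2.0, 1.0, size=100)        # known unit observation variance
n, xbar = len(data), data.mean()

# Exact conjugate posterior variance under a N(0, 10^2) prior on the mean.
post_var = 1.0 / (1.0 / 100.0 + n)

# ABC rejection with the sample mean as summary statistic. The sample mean
# of n unit-variance Gaussians is N(theta, 1/n), so it can be simulated
# directly instead of simulating full data sets.
theta = rng.normal(0.0, 10.0, size=200_000)  # draws from the prior
sims = rng.normal(theta, 1.0 / np.sqrt(n))
accepted = theta[np.abs(sims - xbar) < 0.05] # tolerance epsilon = 0.05

print(f"exact posterior sd: {np.sqrt(post_var):.3f}")
print(f"ABC posterior sd:   {accepted.std():.3f}  (slightly inflated)")
```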


2020 ◽  
Vol 70 (1) ◽  
pp. 145-161 ◽  
Author(s):  
Marnus Stoltz ◽  
Boris Baeumer ◽  
Remco Bouckaert ◽  
Colin Fox ◽  
Gordon Hiscott ◽  
...  

Abstract We describe a new and computationally efficient Bayesian methodology for inferring species trees and demographics from unlinked binary markers. Likelihood calculations are carried out using diffusion models of allele frequency dynamics combined with novel numerical algorithms. The diffusion approach allows for analysis of data sets containing hundreds or thousands of individuals. The method, which we call Snapper, has been implemented as part of the BEAST2 package. We conducted simulation experiments to assess numerical error, computational requirements, and accuracy recovering known model parameters. A reanalysis of soybean SNP data demonstrates that the models implemented in Snapp and Snapper can be difficult to distinguish in practice, a characteristic which we tested with further simulations. We demonstrate the scale of analysis possible using a SNP data set sampled from 399 freshwater turtles in 41 populations. [Bayesian inference; diffusion models; multi-species coalescent; SNP data; species trees; spectral methods.]
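For intuition about the diffusion approach (this is not Snapper's spectral likelihood solver, only a toy forward simulation), the neutral Wright-Fisher diffusion for an allele frequency x satisfies dx = sqrt(x(1 - x) / (2N)) dW and can be simulated with Euler-Maruyama steps as sketched below. The starting frequency, effective population size N, and step sizes are illustrative.

```python
# Euler-Maruyama simulation of the neutral Wright-Fisher diffusion.
import numpy as np

rng = np.random.default_rng(2)

def wright_fisher_path(x0=0.3, N=500, dt=0.1, steps=2_000):
    x = np.empty(steps + 1)
    x[0] = x0
    for t in range(steps):
        # Neutral model: zero drift, variance x(1-x)/(2N) per unit time.
        noise_sd = np.sqrt(x[t] * (1 - x[t]) / (2 * N) * dt)
        x[t + 1] = np.clip(x[t] + noise_sd * rng.normal(), 0.0, 1.0)
    return x

paths = np.array([wright_fisher_path() for _ in range(200)])
# Individual paths absorb at 0 or 1, while the mean stays near x0.
print("mean final frequency:", paths[:, -1].mean().round(3))
```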


2016 ◽  
Vol 62 (1) ◽  
pp. 61-64
Author(s):  
Beata Krupanek ◽  
Ryszard Bogacz

Abstract The paper presents a new conception of building probabilistic models of communication delays in wireless networks, based on using a delta function sequence to describe retransmissions between a transmitter and a receiver. It is assumed that the access time of the transmitter is described by a probability density function and that the communication channel established in the wireless medium is disturbed by passive or active factors, which can corrupt the transmission so that the sent data have to be retransmitted. Theoretical considerations have been verified by measurement results obtained using an experimental system developed for investigating delays caused by external disturbances influencing the wireless transmission. A method for identifying the proposed model parameters and verifying the identified values is also presented.
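A minimal Monte Carlo sketch of this structure, with assumed parameter values rather than measured ones: if each attempt fails independently with probability p and every failed attempt adds a fixed timeout T, then the total-delay density is a mixture of copies of the access-time density shifted by kT, weighted by the retransmission-count distribution (the delta function sequence in the model).

```python
# Monte Carlo of total delay = access time + (number of failed attempts) * T.
import numpy as np

rng = np.random.default_rng(3)

p_fail = 0.2          # per-attempt failure probability (illustrative)
T_retx = 5e-3         # fixed retransmission timeout, seconds (illustrative)

# Assumed exponential access-time density; geometric(1 - p) minus one gives
# the number of failed attempts before the first success.
access = rng.exponential(scale=1e-3, size=100_000)
retx = rng.geometric(1.0 - p_fail, size=100_000) - 1
delay = access + retx * T_retx

print(f"mean delay {delay.mean() * 1e3:.2f} ms, "
      f"retransmission rate {(retx > 0).mean():.2f}")
```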


2021 ◽  
Author(s):  
Florian Wellmann ◽  
Miguel de la Varga ◽  
Nilgün Güdük ◽  
Jan von Harten ◽  
Fabian Stamm ◽  
...  

Geological models, as 3-D representations of subsurface structures and property distributions, are used in many economic, scientific, and societal decision processes. These models are built on prior assumptions and imperfect information, and they often result from an integration of geological and geophysical data types with varying quality. These aspects result in uncertainties about the predicted subsurface structures and property distributions, which will affect the subsequent decision process.

We discuss approaches to evaluate uncertainties in geological models and to integrate geological and geophysical information in combined workflows. A first step is the consideration of uncertainties in prior model parameters on the basis of uncertainty propagation (forward uncertainty quantification). When applied to structural geological models with discrete classes, these methods result in a class probability for each point in space, often represented in tessellated grid cells. These results can then be visualized or forwarded to process simulations. Another option is to add risk functions for subsequent decision analyses. In recent work, these geological uncertainty fields have also been used as an input to subsequent geophysical inversions.

A logical extension to these existing approaches is the integration of geological forward operators into inverse frameworks, to enable a full flow of inference for a wider range of relevant parameters. We investigate here specifically the use of probabilistic machine learning tools in combination with geological and geophysical modeling. Challenges exist due to the hierarchical nature of the probabilistic models, but modern sampling strategies allow for efficient sampling in these complex settings. We showcase the application with examples combining geological modeling and geophysical potential field measurements in an integrated model for improved decision making.
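A minimal sketch of the forward uncertainty quantification step described above, with a deliberately toy "geological model" of a single horizontal interface whose depth is uncertain; the grid, prior, and one-dimensional setting are illustrative stand-ins for the authors' 3-D workflow.

```python
# Monte Carlo over an uncertain interface depth yields a class-probability
# value in every grid cell (forward uncertainty propagation).
import numpy as np

rng = np.random.default_rng(4)

depths = np.linspace(0.0, 100.0, 201)          # 1-D grid of cell depths
n_draws = 5_000
prob_lower_unit = np.zeros_like(depths)

for _ in range(n_draws):
    interface = rng.normal(50.0, 5.0)          # uncertain interface depth
    prob_lower_unit += (depths > interface)    # cell classified as lower unit

prob_lower_unit /= n_draws                     # class probability per cell
print("P(lower unit) at depth 45, 50, 55:",
      prob_lower_unit[[90, 100, 110]].round(2))
```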


2019 ◽  
Vol 7 (1) ◽  
pp. 13-27
Author(s):  
Safaa K. Kadhem ◽  
Sadeq A. Kadhim

"This paper aims at the modeling the crashes count in Al Muthanna governance using finite mixture model. We use one of the most common MCMC method which is called the Gibbs sampler to implement the Bayesian inference for estimating the model parameters. We perform a simulation study, based on synthetic data, to check the ability of the sampler to find the best estimates of the model. We use the two well-known criteria, which are the AIC and BIC, to determine the best model fitted to the data. Finally, we apply our sampler to model the crashes count in Al Muthanna governance.


2020 ◽  
Vol 239 ◽  
pp. 13003
Author(s):  
D. Kumar ◽  
S. B. Alam ◽  
H. Sjöstrand ◽  
J.M. Palau ◽  
C. De Saint Jean

The mathematical models used for nuclear data evaluations contain a large number of theoretical parameters that are usually uncertain. These parameters can be calibrated (or improved) using information collected from integral/differential experiments. Bayesian inference is used to assimilate these measurements; the Bayesian approximation is based on least-squares or Monte Carlo approaches, and in this process the model parameters are optimized. In the adjustment process, it is essential to analyze the influence of individual model parameters on the adjusted data. In this work, some statistical indicators, such as Cook's distance; the Akaike, Bayesian, and deviance information criteria; and the effective degrees of freedom, are developed within the CONRAD platform. These indicators are then applied to a test case of 155Gd to evaluate and compare the influence of resonance parameters.
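As a minimal illustration of two of these indicators, the sketch below computes AIC, BIC, and Cook's distance for an ordinary least-squares fit; this toy regression is a stand-in, not the CONRAD resonance-parameter setting.

```python
# AIC/BIC from the Gaussian log-likelihood, and Cook's distance measuring
# each observation's influence, for a toy least-squares fit.
import numpy as np

rng = np.random.default_rng(6)

n, k = 40, 2                                  # observations, parameters
X = np.column_stack([np.ones(n), np.linspace(0, 1, n)])
y = X @ np.array([1.0, 3.0]) + rng.normal(0, 0.2, n)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
sigma2 = resid @ resid / n                    # ML variance estimate

loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
aic = 2 * k - 2 * loglik
bic = k * np.log(n) - 2 * loglik

# Cook's distance from hat-matrix leverages.
H = X @ np.linalg.inv(X.T @ X) @ X.T
h = np.diag(H)
mse = resid @ resid / (n - k)
cooks = resid**2 / (k * mse) * h / (1 - h)**2

print(f"AIC {aic:.1f}, BIC {bic:.1f}, max Cook's distance {cooks.max():.3f}")
```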


2019 ◽  
Vol 36 (2) ◽  
pp. 586-593
Author(s):  
Boseung Choi ◽  
Yu-Yu Cheng ◽  
Selahattin Cinar ◽  
William Ott ◽  
Matthew R Bennett ◽  
...  

Abstract
Motivation: Advances in experimental and imaging techniques have allowed for unprecedented insights into the dynamical processes within individual cells. However, many facets of intracellular dynamics remain hidden, or can be measured only indirectly. This makes it challenging to reconstruct the regulatory networks that govern the biochemical processes underlying various cell functions. Current estimation techniques for inferring reaction rates frequently rely on marginalization over unobserved processes and states. Even in simple systems this approach can be computationally challenging, and can lead to large uncertainties and lack of robustness in parameter estimates. Therefore we will require alternative approaches to efficiently uncover the interactions in complex biochemical networks.
Results: We propose a Bayesian inference framework based on replacing uninteresting or unobserved reactions with time delays. Although the resulting models are non-Markovian, recent results on stochastic systems with random delays allow us to rigorously obtain expressions for the likelihoods of model parameters. In turn, this allows us to extend MCMC methods to efficiently estimate reaction rates, and delay distribution parameters, from single-cell assays. We illustrate the advantages, and potential pitfalls, of the approach using a birth–death model with both synthetic and experimental data, and show that we can robustly infer model parameters using a relatively small number of measurements. We demonstrate how to do so even when only the relative molecule count within the cell is measured, as in the case of fluorescence microscopy.
Availability and implementation: Accompanying code in R is available at https://github.com/cbskust/DDE_BD.
Supplementary information: Supplementary data are available at Bioinformatics online.
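A heavily simplified sketch of the likelihood-based MCMC idea, using the fact that a birth-death process with constant birth rate b and per-molecule decay rate d has a Poisson(b/d) stationary distribution; the paper's delayed, non-Markovian likelihoods are substantially more involved and are not reproduced here.

```python
# Random-walk Metropolis for the birth/death ratio from snapshot counts,
# assuming stationarity so the likelihood is Poisson(ratio).
import numpy as np

rng = np.random.default_rng(7)

counts = rng.poisson(20.0, size=50)            # synthetic single-cell counts

def log_post(ratio):
    if ratio <= 0:
        return -np.inf
    # Poisson log-likelihood (up to a constant) plus a flat prior.
    return np.sum(counts * np.log(ratio) - ratio)

theta, chain = 10.0, []
for _ in range(20_000):
    prop = theta + rng.normal(0, 1.0)          # random-walk proposal
    if np.log(rng.random()) < log_post(prop) - log_post(theta):
        theta = prop
    chain.append(theta)

burned = np.array(chain[5_000:])
print(f"posterior mean b/d ratio: {burned.mean():.1f} "
      f"(sd {burned.std():.2f})")
```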


2011 ◽  
Vol 128-129 ◽  
pp. 637-641
Author(s):  
Lan Luo ◽  
Qiong Hai Dai ◽  
Chun Xiang Xu ◽  
Shao Quan Jiang

Cipher algorithms are categorized as block ciphers, stream ciphers, and hashes, and they are weighed for faithful transmission, which is known as the independent condition. In faithful transmission, ciphers are studied in terms of their root cipher. Intelligent application of ciphers is a direction that uses the Bayesian model of cognitive science. Bayesian inference is a rational engine for solving such problems within a probabilistic framework, and consequently is at the heart of most probabilistic models for weighing ciphers. The approach of this paper is that ciphers, considered as suitably weighted for different kinds of networks, are ranked based on their root ciphers. The paper also shows other kinds of transformation among the different cipher algorithms themselves.


2017 ◽  
Vol 52 (2) ◽  
pp. 169-195 ◽  
Author(s):  
Ghulam Mustafa ◽  
Afzal Suleman ◽  
Curran Crawford

This paper presents a probabilistic first ply failure analysis of composite laminates using a high-fidelity multi-scale approach called M-SaF (Micromechanics-based approach for Static Failure). To this end, square and hexagonal representative unit cells of composites are developed to calculate constituent stresses with the help of a bridging matrix between macro and micro stresses referred to as the stress amplification factor matrix. Separate failure criteria are applied to each of the constituents (fiber, matrix, and interface) in order to calculate the damage state. The successful implementation of M-SaF requires strength properties of the constituents which are the most difficult and expensive to characterize experimentally, limiting the use of M-SaF in the early design stages of a structure. This obstacle is overcome by integrating a Bayesian inference approach with M-SaF. An academic sample problem of a cantilever beam is used to first demonstrate the calibration procedure. Bayesian inference calibrates the M-SaF first ply failure model parameters as posterior distributions from the prior probability density functions drawn from lamina test data. The posterior statistics were then used to calculate probabilistic first ply failure for a range of different laminates.
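A minimal sketch of the calibration step, assuming a Gaussian prior and measurement model for a single constituent strength and a simple threshold failure check; the numbers and the criterion are illustrative, not M-SaF's micromechanics.

```python
# Conjugate normal-normal calibration of a constituent strength, then a
# probabilistic failure check by sampling the posterior.
import numpy as np

rng = np.random.default_rng(8)

# Prior on matrix tensile strength (MPa) and synthetic lamina-derived data.
mu0, sd0 = 80.0, 15.0
tests = rng.normal(75.0, 5.0, size=12)
sd_meas = 5.0                                   # known measurement sd

# Conjugate update (Gaussian prior, Gaussian likelihood, known sd).
prec = 1 / sd0**2 + len(tests) / sd_meas**2
post_mean = (mu0 / sd0**2 + tests.sum() / sd_meas**2) / prec
post_sd = np.sqrt(1 / prec)

# Probabilistic first-ply-failure check at an applied stress level.
applied = 70.0                                  # MPa, illustrative
draws = rng.normal(post_mean, post_sd, 100_000)
print(f"posterior strength {post_mean:.1f} +/- {post_sd:.1f} MPa; "
      f"P(failure at {applied} MPa) = {(draws < applied).mean():.3f}")
```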


Mathematics ◽  
2020 ◽  
Vol 8 (11) ◽  
pp. 1942
Author(s):  
Andrés R. Masegosa ◽  
Darío Ramos-López ◽  
Antonio Salmerón ◽  
Helge Langseth ◽  
Thomas D. Nielsen

In many modern data analysis problems, the available data is not static but, instead, comes in a streaming fashion. Performing Bayesian inference on a data stream is challenging for several reasons. First, it requires continuous model updating and the ability to handle a posterior distribution conditioned on an unbounded data set. Second, the underlying data distribution may drift from one time step to another, so the classic i.i.d. (independent and identically distributed) or data-exchangeability assumption no longer holds. In this paper, we present an approximate Bayesian inference approach using variational methods that addresses these issues for conjugate exponential family models with latent variables. Our proposal makes use of a novel scheme based on hierarchical priors to explicitly model temporal changes of the model parameters. We show how this approach induces an exponential forgetting mechanism with adaptive forgetting rates. The method is able to capture the smoothness of the concept drift, ranging from no drift to abrupt drift. The proposed variational inference scheme maintains the computational efficiency of variational methods over conjugate models, which is critical in streaming settings. The approach is validated on four different domains (energy, finance, geolocation, and text) using four real-world data sets.
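A minimal sketch of the forgetting mechanism for a conjugate Beta-Bernoulli stream with a fixed forgetting factor rho (the paper instead learns the amount of forgetting adaptively through hierarchical priors):

```python
# Streaming Beta-Bernoulli with exponential forgetting: at each step the
# previous posterior's sufficient statistics are discounted toward the
# prior before absorbing the new batch.
import numpy as np

rng = np.random.default_rng(9)

a0, b0 = 1.0, 1.0            # Beta prior
a, b = a0, b0
rho = 0.95                   # fixed forgetting rate (illustrative)

for t in range(200):
    p_true = 0.2 if t < 100 else 0.8          # abrupt concept drift at t=100
    batch = rng.random(20) < p_true
    # Discount accumulated evidence, keep the prior mass intact.
    a = a0 + rho * (a - a0) + batch.sum()
    b = b0 + rho * (b - b0) + (~batch).sum()
    if t in (99, 110, 199):
        print(f"t={t}: posterior mean {a / (a + b):.2f}")
```

Discounting only the accumulated evidence, while keeping the prior mass intact, lets old batches decay exponentially without the posterior ever collapsing below the prior.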

