Bayesian statistical methods and their application in probabilistic simulation models

2007 · Vol 8 (1) · pp. 5-13
Author(s): Sergio Iannazzo

Bayesian statistical methods are facing a rapidly growing level of interest and acceptance in the field of health economics. The reasons for this success probably lie in the theoretical foundations of the discipline, which make these techniques particularly appealing for decision analysis. To this should be added modern progress in information technology, which has produced several flexible and powerful statistical software frameworks. Among them, one of the most notable is the BUGS language project and its standalone application for MS Windows, WinBUGS. The scope of this paper is to introduce the subject and to show some interesting applications of WinBUGS in developing complex economic models based on Markov chains. The advantages of this approach lie in the elegance of the code produced and in the ease with which probabilistic simulations can be developed. Moreover, an example of the integration of Bayesian inference models into a Markov model is shown. This last feature lets the analyst conduct statistical analyses on the available sources of evidence and exploit them directly as inputs to the economic model.
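To make the modelling style concrete, here is a minimal sketch of a probabilistic Markov cohort simulation, written in Python rather than BUGS; the three-state structure, Beta priors, and per-cycle costs are all invented for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 3-state cohort model: Well -> Sick -> Dead.
# Transition probabilities are drawn from Beta distributions so that
# parameter uncertainty propagates into the cost estimate
# (a probabilistic sensitivity analysis).
n_draws, n_cycles = 1000, 20
cost_per_cycle = np.array([100.0, 1500.0, 0.0])   # invented costs per state

total_costs = np.empty(n_draws)
for i in range(n_draws):
    p_ws = rng.beta(20, 180)   # Well -> Sick  (mean ~0.10)
    p_wd = rng.beta(5, 495)    # Well -> Dead  (mean ~0.01)
    p_sd = rng.beta(30, 170)   # Sick -> Dead  (mean ~0.15)
    P = np.array([
        [1 - p_ws - p_wd, p_ws,     p_wd],
        [0.0,             1 - p_sd, p_sd],
        [0.0,             0.0,      1.0],
    ])
    state = np.array([1.0, 0.0, 0.0])  # whole cohort starts in Well
    cost = 0.0
    for _ in range(n_cycles):
        state = state @ P              # advance the cohort one cycle
        cost += state @ cost_per_cycle
    total_costs[i] = cost

print(f"expected cost: {total_costs.mean():.0f} "
      f"(95% interval {np.percentile(total_costs, 2.5):.0f}"
      f"-{np.percentile(total_costs, 97.5):.0f})")
```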

Author(s): Robert L. Grant, Bob Carpenter, Daniel C. Furr, Andrew Gelman

In this article, we present StataStan, an interface that allows simulation-based Bayesian inference in Stata via calls to Stan, the flexible, open-source Bayesian inference engine. Stan is written in C++, and Stata users can use the commands stan and windowsmonitor to run Stan programs from within Stata. We provide a brief overview of Bayesian algorithms, details of the commands available from Statistical Software Components, considerations for users who are new to Stan, and a simple example. Stan uses a different algorithm from bayesmh, BUGS, JAGS, SAS, and MLwiN: the no-U-turn sampler, an adaptive variant of Hamiltonian Monte Carlo, which can provide considerable improvements in efficiency and speed. In a companion article, we give an extended comparison of StataStan and bayesmh in the context of item response theory models.
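As a rough, self-contained illustration of the mechanism (not StataStan or Stan code), the Python sketch below implements a single-parameter Hamiltonian Monte Carlo step with a leapfrog integrator for a standard normal target:

```python
import numpy as np

rng = np.random.default_rng(0)

# Target: standard normal, so -log p(q) = q**2 / 2 and its gradient is q.
def grad_U(q):
    return q

def hmc_step(q, eps=0.1, n_leapfrog=20):
    """One Hamiltonian Monte Carlo step with a leapfrog integrator."""
    p = rng.normal()                      # resample momentum
    q_new, p_new = q, p
    p_new -= 0.5 * eps * grad_U(q_new)    # half step for momentum
    for _ in range(n_leapfrog - 1):
        q_new += eps * p_new              # full step for position
        p_new -= eps * grad_U(q_new)      # full step for momentum
    q_new += eps * p_new
    p_new -= 0.5 * eps * grad_U(q_new)    # final half step
    # Metropolis accept/reject on the total energy change
    H_old = 0.5 * q**2 + 0.5 * p**2
    H_new = 0.5 * q_new**2 + 0.5 * p_new**2
    return q_new if rng.random() < np.exp(H_old - H_new) else q

q, draws = 0.0, []
for _ in range(5000):
    q = hmc_step(q)
    draws.append(q)
print(np.mean(draws), np.std(draws))  # should be near 0 and 1
```

Gradient-guided trajectories like this can travel far across the posterior in a single iteration, which is the source of the efficiency gains over random-walk and Gibbs-style samplers.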


2009
Author(s): Yuan Sophie Liu, Angela Yu, Philip Holmes

2003 · Vol 7 (2) · pp. 197-211
Author(s): J. P. Whiting, M. F. Lambert, A. V. Metcalfe

Abstract. The annual rainfall time series for Sydney from 1859 to 1999 is analysed. Clear evidence of nonstationarity is presented, but substantial evidence for persistence or hidden states is more elusive. A test of the hypothesis that a hidden state Markov model reduces to a mixture distribution is presented. There is strong evidence of a correlation between annual rainfall and climate indices. Strong evidence of persistence in one of these indices, the Pacific Decadal Oscillation (PDO), is presented, together with a demonstration that it is better modelled by fractional differencing than by a hidden state Markov model. It is shown that conditioning the logarithm of rainfall on the PDO, the Southern Oscillation Index (SOI), and their interaction provides realistic simulation of rainfall that matches observed statistics. Similar simulation models are presented for Brisbane, Melbourne and Perth.
Keywords: hydrological persistence, hidden state Markov models, fractional differencing, PDO, SOI, Australian rainfall
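A minimal Python sketch of the kind of conditional simulation described, with invented coefficients (the paper's fitted values for Sydney are not reproduced here): the logarithm of annual rainfall is modelled as linear in the PDO, the SOI, and their interaction, with Gaussian noise.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented regression coefficients for illustration only; the paper
# fits these to the Sydney record.
b0, b_pdo, b_soi, b_int, sigma = 7.0, -0.05, 0.02, 0.01, 0.2

def simulate_rainfall(pdo, soi):
    """Draw annual rainfall given climate-index values."""
    mean_log = b0 + b_pdo * pdo + b_soi * soi + b_int * pdo * soi
    return np.exp(rng.normal(mean_log, sigma))

# Simulate 100 years under hypothetical index trajectories.
pdo = rng.normal(0, 1, 100)
soi = rng.normal(0, 8, 100)
rain = simulate_rainfall(pdo, soi)
print(rain.mean(), rain.std())
```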


Entropy · 2019 · Vol 21 (6) · pp. 600
Author(s): Jakob Schöttl, Michael Seitz, Gerta Köster

In pedestrian dynamics, individual-based models serve to simulate the behavior of crowds so that evacuation times and crowd densities can be estimated or the efficiency of public transportation optimized. Often, train systems are investigated where seat choice may have a great impact on capacity utilization, especially when passengers get in each other’s way. Therefore, it is useful to reproduce passengers’ behavior inside trains. However, there is surprisingly little research on the subject. Do passengers distribute evenly, as is most often assumed in simulation models and as one would expect from a system that obeys the laws of thermodynamics? Or is there a higher degree of order? To answer these questions, we collect data on seating behavior in Munich’s suburban trains and analyze it. Clear preferences are revealed that contradict the former assumption of a uniform distribution. We subsequently introduce a model that matches the probability distributions we observed. We demonstrate the applicability of our model and present a qualitative validation with a simulation example. The model’s implementation is part of the free and open-source Vadere simulation framework for pedestrian dynamics and thus available for further studies. The model can be used as one component in larger systems for the simulation of public transport.
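As a hypothetical sketch of a preference-based seat-choice rule of the general kind the paper proposes (the feature weights below are invented, not the distributions estimated from the Munich data), each free seat can be scored by its features and chosen with probability proportional to the exponentiated score:

```python
import numpy as np

rng = np.random.default_rng(7)

# Invented feature weights; the paper estimates seat preferences
# from observations in Munich suburban trains.
W_WINDOW, W_FORWARD, W_NEIGHBOR = 1.0, 0.5, -2.0

def choose_seat(seats):
    """Pick a seat index with probability proportional to exp(utility).

    `seats` is a list of dicts with boolean features:
    window, forward (facing direction of travel),
    neighbor (adjacent seat already taken).
    """
    utilities = np.array([
        W_WINDOW * s["window"]
        + W_FORWARD * s["forward"]
        + W_NEIGHBOR * s["neighbor"]
        for s in seats
    ])
    probs = np.exp(utilities)
    probs /= probs.sum()
    return rng.choice(len(seats), p=probs)

seats = [
    {"window": True,  "forward": True,  "neighbor": False},
    {"window": False, "forward": True,  "neighbor": False},
    {"window": True,  "forward": False, "neighbor": True},
]
print(choose_seat(seats))
```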


2006 · Vol 91 (10-11) · pp. 1310-1314
Author(s): C.J. Pérez, J. Martín, M.J. Rufo

Author(s): Yan Wang

Variability is inherent randomness in systems, whereas uncertainty is due to lack of knowledge. In this paper, a generalized multiscale Markov (GMM) model is proposed to quantify variability and uncertainty simultaneously in multiscale system analysis. The GMM model is based on a new imprecise probability theory that takes the form of generalized intervals, a Kaucher or modal extension of classical set-based intervals for representing uncertainty. The properties of the new definitions of independence and Bayesian inference are studied. Based on a new Bayes’ rule with generalized intervals, three cross-scale validation approaches that incorporate variability and uncertainty propagation are also developed.
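As a loose illustration only: the Python sketch below bounds a posterior probability when the prior and likelihoods are classical set-based intervals, by enumerating endpoint combinations (valid here because the posterior is monotone in each argument). It does not implement the paper's Kaucher/modal generalized intervals or its generalized Bayes' rule.

```python
from itertools import product

def interval_posterior(prior, lik_h, lik_not_h):
    """Bounds on P(H | e) with interval-valued prior and likelihoods.

    Classical set-based intervals, evaluated at endpoint combinations;
    this works because P(H | e) = p*L / (p*L + (1-p)*M) is monotone in
    each of p, L, and M. The paper's generalized (Kaucher) intervals,
    which this sketch does not implement, handle such dependence
    algebraically instead.
    """
    values = [
        p * lh / (p * lh + (1 - p) * ln)
        for p, lh, ln in product(prior, lik_h, lik_not_h)
    ]
    return min(values), max(values)

# Hypothetical numbers: prior for H in [0.2, 0.4],
# P(e|H) in [0.7, 0.9], P(e|not H) in [0.1, 0.2].
print(interval_posterior((0.2, 0.4), (0.7, 0.9), (0.1, 0.2)))
```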


2013 · Vol 10 (01) · pp. 1350007
Author(s): Frederick Betz

For economic and environmental policies aimed at developing sustainable economies, it is important to have a general modeling approach that can quantitatively connect economic processes with the biological and physical processes of the environment. If economic processes cannot be measured in terms of their real physical and biological impacts, one does not know whether such economic processes are sustainable. To create an integrated economic/environmental model, we have mathematically generalized Leontief's input–output economic model from vector notation to tensor notation. The use of tensor mathematics for input–output models of both an economy and its environment provides a data architecture for creating simulation models of the environmental impact of an economy.
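For reference, the classical vector-form Leontief model being generalized: with technical-coefficient matrix A and final-demand vector d, gross output x solves x = Ax + d, i.e., x = (I - A)^-1 d. A minimal numeric sketch in Python with invented two-sector coefficients (the tensor extension itself is not attempted here):

```python
import numpy as np

# Invented 2-sector technical coefficients: A[i, j] is the input
# from sector i needed per unit of output of sector j.
A = np.array([[0.2, 0.3],
              [0.1, 0.4]])
d = np.array([100.0, 50.0])          # final demand per sector

# Leontief: x = A x + d  =>  x = (I - A)^{-1} d
x = np.linalg.solve(np.eye(2) - A, d)
print(x)                             # gross output per sector
```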


Author(s): Bethany Pickett, Cameron J. Turner, Anthony Petrella

Probabilistic simulation methods have allowed for many advancements in the field of biomechanics, especially for the human spine. To accurately model a complex system such as the spine, the model must account for the differences that occur from one specimen to the next. These differences in material properties and anatomical shapes are described probabilistically. Accurately modeling the effects of these differences is important in biomechanics, as no two people are exactly alike, yet building individual models of every person is impractical. Several authors have conducted research into more accurate ways to model biomechanical systems such as the spine; however, the computational expense of performing analysis and optimization with these probabilistic simulation models remains an issue, particularly with respect to the underlying Monte Carlo simulations. The research described in this paper investigates the use of Non-Uniform Rational B-Spline (NURBS) based metamodels to reduce the cost of expensive probabilistic simulation models of the spine for analysis and optimization. Metamodels are simply mathematical approximations of a model: in other words, models of models. Metamodels are widely used to represent the behavior of complex systems based on limited data from the original system model, and they are often more computationally efficient to store and analyze than the original system models they approximate. Using a Functional Spinal Unit (FSU) finite element model, two different probabilistic NURBS-based metamodeling methods were developed and tested. Through the use of metamodels, a promising approach for reducing the computational time of running a Monte Carlo simulation was discovered.
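A simplified sketch of the surrogate idea in Python, with a one-dimensional cubic B-spline from SciPy standing in for the paper's NURBS metamodels and a cheap analytic function standing in for the spine finite element model: fit the surrogate to a handful of expensive evaluations, then run the Monte Carlo on the surrogate.

```python
import numpy as np
from scipy.interpolate import make_interp_spline

rng = np.random.default_rng(3)

def expensive_model(x):
    """Stand-in for a costly FE simulation (e.g., spine stiffness)."""
    return np.sin(3 * x) + 0.5 * x**2

# Fit a cubic B-spline surrogate to a few expensive evaluations.
x_train = np.linspace(0.0, 2.0, 9)
surrogate = make_interp_spline(x_train, expensive_model(x_train), k=3)

# Monte Carlo on the cheap surrogate instead of the expensive model.
samples = rng.normal(1.0, 0.3, 100_000)   # uncertain input parameter
samples = np.clip(samples, 0.0, 2.0)      # stay inside the fitted range
outputs = surrogate(samples)
print(outputs.mean(), outputs.std())
```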


2010 · Vol 22 (6) · pp. 1573-1596
Author(s): Arturo Berrones

A novel formalism for Bayesian learning in the context of complex inference models is proposed. The method is based on the use of the stationary Fokker-Planck (SFP) approach to sample from the posterior density. Stationary Fokker-Planck sampling generalizes the Gibbs sampler algorithm to arbitrary and unknown conditional densities. By the SFP procedure, approximate analytical expressions for the conditionals and marginals of the posterior can be constructed. At each stage of SFP, the approximate conditionals are used to define a Gibbs sampling process, which converges to the full joint posterior. From the analytical marginals, efficient learning methods in the context of artificial neural networks are outlined. Offline and incremental Bayesian inference and maximum likelihood estimation from the posterior are performed in classification and regression examples. A comparison of SFP with other Monte Carlo strategies in the general problem of sampling from arbitrary densities is also presented. It is shown that SFP is able to jump across large low-probability regions without careful tuning of any step-size parameter. In fact, the SFP method requires only a small set of meaningful parameters that can be selected following clear, problem-independent guidelines. The computational cost of SFP, measured in terms of loss-function evaluations, grows linearly with the given model's dimension.
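For context, a plain Gibbs sampler of the kind SFP generalizes, sketched in Python for a bivariate normal target whose full conditionals are known in closed form; SFP's contribution is to construct approximate conditionals when, unlike here, no closed form is available.

```python
import numpy as np

rng = np.random.default_rng(5)
rho = 0.8   # correlation of the bivariate normal target

# Gibbs sampling: alternately draw each coordinate from its full
# conditional. For this target both conditionals are normal:
#   x1 | x2 ~ N(rho * x2, 1 - rho**2), and symmetrically for x2.
x1, x2 = 0.0, 0.0
draws = np.empty((10_000, 2))
for i in range(len(draws)):
    x1 = rng.normal(rho * x2, np.sqrt(1 - rho**2))
    x2 = rng.normal(rho * x1, np.sqrt(1 - rho**2))
    draws[i] = x1, x2

print(np.corrcoef(draws.T)[0, 1])   # should be close to rho
```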

