Bayesian computation for statistical models with intractable normalizing constants

2013 ◽ Vol 27 (4) ◽ pp. 416-436
Author(s): Yves F. Atchadé ◽ Nicolas Lartillot ◽ Christian Robert

2013 ◽ Vol 25 (8) ◽ pp. 2199-2234
Author(s): Faming Liang ◽ Ick-Hoon Jin

Simulating from distributions with intractable normalizing constants has been a long-standing problem in machine learning. In this letter, we propose a new algorithm, the Monte Carlo Metropolis-Hastings (MCMH) algorithm, for tackling this problem. The MCMH algorithm is a Monte Carlo version of the Metropolis-Hastings algorithm: it replaces the unknown normalizing constant ratio with a Monte Carlo estimate in simulations while, as shown in the letter, still converging to the desired target distribution under mild conditions. The MCMH algorithm is illustrated with spatial autologistic models and exponential random graph models. Unlike other auxiliary variable Markov chain Monte Carlo (MCMC) algorithms, such as the Møller and exchange algorithms, the MCMH algorithm avoids the requirement for perfect sampling and thus can be applied to many statistical models for which perfect sampling is unavailable or very expensive. The MCMH algorithm can also be applied to Bayesian inference for random effect models and missing data problems that involve simulations from a distribution with intractable integrals.
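The idea in the abstract above can be sketched in a few lines: in the Metropolis-Hastings ratio, the intractable factor Z(theta)/Z(theta') is replaced by an importance-sampling estimate built from auxiliary draws. The following is a minimal sketch only, not the authors' implementation; the Gaussian toy model (whose constant is treated as unknown), the prior, the auxiliary-sample size m, and the proposal scale are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_g(x, theta):
    # Unnormalized log-density g(x, theta) = exp(theta*x - x^2/2).
    # This is a Gaussian with mean theta, but its normalizing
    # constant Z(theta) is treated as intractable for illustration.
    return theta * x - 0.5 * x ** 2

def sample_aux(theta, m):
    # Auxiliary draws from f(.|theta). Here exact sampling is easy;
    # in real applications a short MCMC run would be used instead --
    # perfect sampling is not required.
    return rng.normal(theta, 1.0, size=m)

def log_prior(theta):
    # Diffuse N(0, 10^2) prior (an illustrative choice).
    return -0.5 * theta ** 2 / 100.0

def mcmh(x_obs, n_iter=5000, m=50, step=0.5):
    theta, chain = 0.0, []
    for _ in range(n_iter):
        theta_new = theta + step * rng.normal()  # symmetric proposal
        # Monte Carlo estimate of Z(theta_new)/Z(theta):
        # E_{y ~ f(.|theta)}[g(y, theta_new)/g(y, theta)] equals that ratio.
        y = sample_aux(theta, m)
        log_ratio_hat = np.log(np.mean(np.exp(log_g(y, theta_new) - log_g(y, theta))))
        # MH log-acceptance with the estimated ratio plugged in
        # (proposal terms cancel by symmetry).
        log_alpha = (log_g(x_obs, theta_new) - log_g(x_obs, theta)
                     + log_prior(theta_new) - log_prior(theta)
                     - log_ratio_hat)
        if np.log(rng.uniform()) < log_alpha:
            theta = theta_new
        chain.append(theta)
    return np.array(chain)

chain = mcmh(x_obs=2.0)
print(chain[2500:].mean())  # posterior mean, roughly near the observation
```

With an exact ratio this reduces to plain Metropolis-Hastings; the point of MCMH is that the plug-in estimate suffices for convergence under the conditions given in the letter.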


2019 ◽ Vol 6 (1) ◽ pp. 379-403
Author(s): Mark A. Beaumont

Many of the statistical models that could provide an accurate, interesting, and testable explanation for the structure of a data set turn out to have intractable likelihood functions. The method of approximate Bayesian computation (ABC) has become a popular approach for tackling such models. This review gives an overview of the method and the main issues and challenges that are the subject of current research.
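The core of ABC, as reviewed above, is to bypass the likelihood entirely: draw a parameter from the prior, simulate data forward, and keep the parameter only if the simulation resembles the observations. A minimal rejection-ABC sketch follows; the Gaussian toy model, the mean as summary statistic, the uniform prior, and the tolerance are all illustrative assumptions, not choices from the review.

```python
import numpy as np

rng = np.random.default_rng(1)

# Observed data from an assumed model y ~ N(mu, 1) with unknown mu.
y_obs = rng.normal(2.0, 1.0, size=100)
s_obs = y_obs.mean()  # summary statistic of the observed data

# Rejection ABC: no likelihood evaluations, only forward simulation.
accepted = []
for _ in range(20000):
    mu = rng.uniform(-5, 5)               # draw from the prior
    y_sim = rng.normal(mu, 1.0, size=100)  # simulate data under mu
    if abs(y_sim.mean() - s_obs) < 0.05:   # keep mu if summaries agree
        accepted.append(mu)

posterior = np.array(accepted)
print(len(posterior), posterior.mean())  # approximate posterior sample
```

The accepted values approximate the posterior only up to the tolerance and the information lost in the summary statistic; choosing summaries and tolerances well is among the main challenges the review discusses.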

