A probabilistic solution to the MEG inverse problem via MCMC methods: the reversible jump and parallel tempering algorithms
2001 ◽ Vol 48 (5) ◽ pp. 533-542 ◽ Author(s): C. Bertrand, M. Ohmi, R. Suzuki, H. Kado

2010 ◽ Vol 68 ◽ pp. e332 ◽ Author(s): Taku Yoshioka, Ken-ichi Morishige, Mitsuo Kawato, Masa-aki Sato

Technometrics ◽ 2015 ◽ Vol 57 (1) ◽ pp. 123-137 ◽ Author(s): Siva Tian, Jianhua Z. Huang, Haipeng Shen

Author(s): Russell Cheng

Fitting a finite mixture model when the number of components, k, is unknown can be carried out by the maximum likelihood (ML) method, though the problem is non-standard. Two well-known Bayesian Markov chain Monte Carlo (MCMC) methods are reviewed and compared with ML: the reversible jump method and a method using an approximating Dirichlet process. A further Bayesian method, here called MAPIS, is examined: it first obtains point estimates of the component parameters by the maximum a posteriori method for different k, and then estimates posterior distributions, including that of k, using importance sampling. MAPIS is compared with ML and the MCMC methods. The MCMC methods produce multimodal posterior parameter distributions in overfitted models, which biases the posterior distribution of k towards high k; it is shown that MAPIS does not suffer from this problem. A simple numerical example is discussed.
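The ML route mentioned in the abstract can be sketched in a few lines. The following is a minimal illustration, not the paper's MAPIS procedure: it fits a one-dimensional Gaussian mixture by EM for a range of candidate k and picks k by BIC, one standard non-Bayesian way to handle unknown k. All names (`em_gmm_1d`, `bic_select`) and the synthetic data are illustrative assumptions, not from the source.

```python
import numpy as np

def em_gmm_1d(x, k, n_iter=200, seed=0):
    """EM for a one-dimensional Gaussian mixture with k components.
    Returns (weights, means, variances, log-likelihood). Illustrative sketch."""
    rng = np.random.default_rng(seed)
    n = len(x)
    w = np.full(k, 1.0 / k)                    # mixing weights
    mu = rng.choice(x, size=k, replace=False)  # initialise means at random data points
    var = np.full(k, x.var())                  # start all variances at the sample variance
    for _ in range(n_iter):
        # E-step: responsibilities r[i, j] = P(component j | x_i), computed in log space
        logp = np.log(w) - 0.5 * (np.log(2 * np.pi * var)
                                  + (x[:, None] - mu) ** 2 / var)
        m = logp.max(axis=1, keepdims=True)
        p = np.exp(logp - m)
        r = p / p.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, variances from the responsibilities
        nk = np.maximum(r.sum(axis=0), 1e-9)   # guard against empty components
        w = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = np.maximum((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk, 1e-6)
    # final log-likelihood under the fitted parameters
    logp = np.log(w) - 0.5 * (np.log(2 * np.pi * var)
                              + (x[:, None] - mu) ** 2 / var)
    m = logp.max(axis=1, keepdims=True)
    loglik = (m.ravel() + np.log(np.exp(logp - m).sum(axis=1))).sum()
    return w, mu, var, loglik

def bic_select(x, k_max=4, n_init=5):
    """Choose k by minimising BIC = -2 log L + (#free parameters) log n."""
    n = len(x)
    best_k, best_bic = 1, np.inf
    for k in range(1, k_max + 1):
        # best log-likelihood over several random initialisations
        ll = max(em_gmm_1d(x, k, seed=s)[3] for s in range(n_init))
        n_par = 3 * k - 1                      # (k-1) weights + k means + k variances
        bic = -2 * ll + n_par * np.log(n)
        if bic < best_bic:
            best_k, best_bic = k, bic
    return best_k

# Two well-separated components; BIC should recover k = 2
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-3, 1, 300), rng.normal(3, 1, 300)])
print(bic_select(x))
```

Note that this ML/BIC criterion penalises extra components explicitly, whereas the abstract's point is that MCMC posteriors over k in overfitted mixtures can be biased upward by multimodality of the parameter posterior, the problem MAPIS is designed to avoid.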


NeuroImage ◽ 2006 ◽ Vol 31 (2) ◽ pp. 623-626 ◽ Author(s): Gareth R. Barnes, Paul L. Furlong, Krish D. Singh, Arjan Hillebrand
