Sequential Monte Carlo Filtering with Gaussian Mixture Sampling

2019 ◽  
Vol 42 (9) ◽  
pp. 2069-2077 ◽  
Author(s):  
Sehyun Yun ◽  
Renato Zanetti

2013 ◽  
Vol 2013 ◽  
pp. 1-16 ◽  
Author(s):  
Xianghui Yuan ◽  
Feng Lian ◽  
Chongzhao Han

By integrating the cardinality balanced multi-target multi-Bernoulli (CBMeMBer) filter with the interacting multiple model (IMM) algorithm, an MM-CBMeMBer filter is proposed in this paper for tracking multiple maneuvering targets in clutter. The sequential Monte Carlo (SMC) method is used to implement the filter for generic multi-target models, and the Gaussian mixture (GM) method is used to implement it for linear-Gaussian multi-target models. The extended Kalman (EK) and unscented Kalman (UK) filtering approximations, which allow the GM-MM-CBMeMBer filter to accommodate mildly nonlinear models, are then described briefly. Simulation results are presented to show the effectiveness of the proposed filter.
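The SMC implementation referred to in the abstract rests on the generic particle predict-reweight-resample cycle. A minimal sketch in Python, assuming user-supplied transition and likelihood functions (the full CBMeMBer recursion additionally propagates hypothesis existence probabilities, which are omitted here; all names are illustrative):

```python
import numpy as np

def smc_step(particles, weights, transition, likelihood, rng):
    """One predict-update-resample cycle of a bootstrap particle filter.

    A generic SMC building block, not the full CBMeMBer recursion.
    """
    # Predict: propagate each particle through the motion model.
    particles = transition(particles, rng)
    # Update: reweight by the measurement likelihood and normalize.
    weights = weights * likelihood(particles)
    weights = weights / weights.sum()
    # Resample (multinomial) to combat weight degeneracy.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))
```

For example, with a random-walk transition and a Gaussian likelihood centered on an observation, one call of `smc_step` pulls the particle cloud toward the measurement and returns equal post-resampling weights.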


2017 ◽  
Vol 145 (7) ◽  
pp. 2533-2553 ◽  
Author(s):  
Andreas S. Stordal ◽  
Hans A. Karlsen

In high-dimensional dynamic systems, standard Monte Carlo techniques that asymptotically reproduce the posterior distribution are computationally too expensive. Alternative sampling strategies are usually applied, and among these the ensemble Kalman filter (EnKF) is perhaps the most popular. However, the EnKF suffers from severe bias if the model under consideration is far from linear. Another class of sequential Monte Carlo methods is kernel-based Gaussian mixture filters, which reduce the bias but maintain the robustness of the EnKF. Although many hybrid methods have been introduced in recent years, few have been analyzed theoretically. Here it is shown that the recently proposed adaptive Gaussian mixture filter can be formulated in a rigorous Bayesian framework and that the algorithm can be generalized to a broader class of interpolated kernel filters. Two parameters, the bandwidth of the kernel and a weight interpolation factor, determine the filter performance. The new formulation of the filter includes particle filters, the EnKF, and kernel-based Gaussian mixture filters as special cases. Techniques from the particle filter literature are used to calculate the asymptotic bias of the filter as a function of the parameters and to derive a central limit theorem. The asymptotic theory is then used to determine the parameters as a function of the sample size in a robust way, such that the error norm vanishes asymptotically while the normalized error is sample independent and bounded. The parameter choice is tested on the Lorenz 63 model, where it is shown that the error is smaller than or equal to that of the EnKF and the optimal particle filter over a range of sample sizes.
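The weight interpolation factor described in the abstract can be illustrated in a few lines: blending normalized importance weights with uniform weights interpolates between a particle filter (factor 1) and an equally weighted EnKF-style ensemble (factor 0). A hedged sketch, assuming log-likelihood values per ensemble member (the actual filter also depends on the kernel bandwidth, which is omitted here):

```python
import numpy as np

def interpolated_weights(log_lik, alpha):
    """Blend normalized importance weights with uniform weights.

    alpha = 1 recovers particle-filter importance weights; alpha = 0
    gives the equal weights of an EnKF-style ensemble. Illustrative
    sketch only; the full filter also involves a kernel bandwidth.
    """
    w = np.exp(log_lik - np.max(log_lik))  # stabilized exponentiation
    w = w / w.sum()
    n = len(w)
    return alpha * w + (1.0 - alpha) / n
```

Because both ingredients sum to one, the interpolated weights remain a valid probability vector for any factor in [0, 1]; shrinking the factor trades bias for a lower-variance, more robust update.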


Author(s):  
Edward P. Herbst ◽  
Frank Schorfheide

Dynamic stochastic general equilibrium (DSGE) models have become one of the workhorses of modern macroeconomics and are extensively used for academic research as well as forecasting and policy analysis at central banks. This book introduces readers to state-of-the-art computational techniques used in the Bayesian analysis of DSGE models. The book covers Markov chain Monte Carlo techniques for linearized DSGE models, novel sequential Monte Carlo methods that can be used for parameter inference, and the estimation of nonlinear DSGE models based on particle filter approximations of the likelihood function. The theoretical foundations of the algorithms are discussed in depth, and detailed empirical applications and numerical illustrations are provided. The book also gives invaluable advice on how to tailor these algorithms to specific applications and assess the accuracy and reliability of the computations. The book is essential reading for graduate students, academic researchers, and practitioners at policy institutions.
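The particle filter approximation of the likelihood mentioned in the blurb averages measurement-density weights over propagated particles at each date and accumulates the log of those averages. A minimal sketch for a toy linear-Gaussian random-walk model (illustrative only; a DSGE application would substitute the model's own state transition and measurement equations):

```python
import numpy as np

def pf_loglik(y, n_part, sigma_x, sigma_y, rng):
    """Bootstrap particle filter estimate of the log-likelihood of
    x_t = x_{t-1} + e_t,  y_t = x_t + v_t  (toy linear-Gaussian model).
    """
    x = rng.normal(0.0, 1.0, n_part)  # initial particle cloud
    ll = 0.0
    for obs in y:
        # Propagate particles through the state transition.
        x = x + rng.normal(0.0, sigma_x, n_part)
        # Incremental likelihood: average measurement density.
        inc = np.exp(-0.5 * ((obs - x) / sigma_y) ** 2) / (sigma_y * np.sqrt(2 * np.pi))
        ll += np.log(inc.mean())
        # Resample in proportion to the incremental weights.
        w = inc / inc.sum()
        x = x[rng.choice(n_part, n_part, p=w)]
    return ll
```

The returned value is a noisy but unbiased (in levels, not logs) estimate of the likelihood, which is what makes it usable inside the Markov chain Monte Carlo and sequential Monte Carlo samplers the book discusses.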

