Quasi-Monte Carlo Bayesian estimation under Besov priors in elliptic inverse problems

2020, pp. 1
Author(s): Lukas Herrmann, Magdalena Keller, Christoph Schwab

2015, Vol 25 (4), pp. 727-737
Author(s): Alexandros Beskos, Ajay Jasra, Ege A. Muzaffer, Andrew M. Stuart

2017, Vol 27 (05), pp. 953-995
Author(s): Josef Dick, Robert N. Gantner, Quoc T. Le Gia, Christoph Schwab

We propose and analyze deterministic multilevel (ML) approximations for Bayesian inversion of operator equations with uncertain distributed parameters, subject to additive Gaussian measurement data. The algorithms use a ML approach based on deterministic, higher-order quasi-Monte Carlo (HoQMC) quadrature for approximating the high-dimensional expectations, which arise in the Bayesian estimators, and a Petrov–Galerkin (PG) method for approximating the solution to the underlying partial differential equation (PDE). This extends the previous single-level (SL) approach from [J. Dick, R. N. Gantner, Q. T. Le Gia and Ch. Schwab, Higher order quasi-Monte Carlo integration for Bayesian estimation, Report 2016-13, Seminar for Applied Mathematics, ETH Zürich (in review)]. Compared to the SL approach, the present convergence analysis of the ML method requires stronger assumptions on holomorphy and regularity of the countably-parametric uncertainty-to-observation maps of the forward problem. As in the SL case and in the affine-parametric case analyzed in [J. Dick, F. Y. Kuo, Q. T. Le Gia and Ch. Schwab, Multi-level higher order QMC Galerkin discretization for affine parametric operator equations, SIAM J. Numer. Anal. 54 (2016) 2541–2568], we obtain sufficient conditions which allow us to achieve arbitrarily high, algebraic convergence rates in terms of work, which are independent of the dimension of the parameter space. The convergence rates are limited only by the spatial regularity of the forward problem, the discretization order achieved by the PG discretization, and by the sparsity of the uncertainty parametrization. 
We provide detailed numerical experiments for linear elliptic problems in two space dimensions, with a large number of parameters characterizing the uncertain input, confirming the theory and showing that the ML HoQMC algorithms can outperform, in terms of error versus computational work, both multilevel Monte Carlo (MLMC) methods and SL HoQMC methods, provided the parametric solution maps of the forward problems are sufficiently smooth and sparse with respect to the high-dimensional parameters.
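The multilevel structure described above can be sketched in a few lines. The snippet below is a minimal illustration of the telescoping-sum idea only: it uses first-order scrambled Sobol points rather than the higher-order digital nets of the HoQMC method, and a toy parametric series `G` stands in for the Petrov–Galerkin PDE solve at each discretization level (both are assumptions for illustration, not the authors' implementation).

```python
import numpy as np
from scipy.stats import qmc

def G(y, level):
    """Toy uncertainty-to-observation map: a parametric series truncated
    at more terms on higher levels. Stands in for a Petrov-Galerkin PDE
    solve at mesh level `level` (hypothetical, for illustration only)."""
    m = min(y.shape[1], 4 * 2**level)
    k = np.arange(1, m + 1)
    return (np.sin(0.3 * np.pi * k) / k**2 * y[:, :m]).sum(axis=1)

def ml_qmc_estimate(L, dim=32, n0=256):
    """Multilevel QMC estimator of E[G_L] via the telescoping sum
    E[G_L] = E[G_0] + sum_{l=1}^{L} E[G_l - G_{l-1}],
    spending fewer QMC points on the finer (more expensive) levels."""
    total = 0.0
    for l in range(L + 1):
        n = max(n0 // 2**l, 8)                    # geometric decay of sample sizes
        u = qmc.Sobol(d=dim, scramble=True, seed=l).random(n)
        y = 2.0 * u - 1.0                         # map [0,1)^s to [-1,1]^s
        correction = G(y, l) - (G(y, l - 1) if l > 0 else 0.0)
        total += correction.mean()
    return total

est = ml_qmc_estimate(3)
```

The key cost saving is visible in the loop: the level-`l` corrections `G_l - G_{l-1}` shrink as the discretizations converge, so far fewer quadrature points are needed on the expensive fine levels.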


Energies
2021, Vol 14 (8), pp. 2328
Author(s):  
Mohammed Alzubaidi ◽  
Kazi N. Hasan ◽  
Lasantha Meegahapola ◽  
Mir Toufikur Rahman

This paper presents a comparative analysis of six sampling techniques to identify an efficient and accurate technique for probabilistic voltage stability assessment in large-scale power systems. The six techniques compared are Monte Carlo (MC), three versions of Quasi-Monte Carlo (QMC), i.e., Sobol, Halton, and Latin Hypercube, Markov Chain MC (MCMC), and importance sampling (IS), each evaluated for its suitability in probabilistic voltage stability analysis of large-scale uncertain power systems. The coefficient of determination (R²) and root mean square error (RMSE) are calculated to measure the accuracy and efficiency of the sampling techniques relative to each other. All six techniques provide more than 99% accuracy when a large number of wind speed random samples (8760 samples) is produced. In terms of efficiency, however, the three versions of QMC are the most efficient, providing more than 96% accuracy with only a small number of generated samples (150 samples).
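The MC-versus-QMC efficiency gap at small sample budgets can be demonstrated on a one-dimensional toy problem. The sketch below estimates the mean of a Weibull wind-speed model by inverse-CDF sampling with MC and the three QMC variants from the abstract; the Weibull parameters (shape 2, scale 8 m/s) are illustrative assumptions, not values from the paper, and this is not the paper's power-system model.

```python
import math
import numpy as np
from scipy.stats import qmc

# Weibull wind-speed model; shape K and scale LAM are illustrative values.
K, LAM = 2.0, 8.0
TRUE_MEAN = LAM * math.gamma(1.0 + 1.0 / K)       # Weibull mean: lam * Gamma(1 + 1/k)

def estimate_mean(u):
    """Push uniform samples u in [0,1) through the Weibull inverse CDF."""
    speeds = LAM * (-np.log1p(-u)) ** (1.0 / K)
    return speeds.mean()

n = 150                                           # small sample budget, as in the abstract
rng = np.random.default_rng(0)
samples = {
    "MC":     rng.random(n),
    "Sobol":  qmc.Sobol(d=1, scramble=True, seed=0).random(n).ravel(),
    "Halton": qmc.Halton(d=1, scramble=True, seed=0).random(n).ravel(),
    "LHS":    qmc.LatinHypercube(d=1, seed=0).random(n).ravel(),
}
errors = {name: abs(estimate_mean(u) - TRUE_MEAN) for name, u in samples.items()}
for name, err in errors.items():
    print(f"{name:6s} abs. error = {err:.4f}")
```

Because the low-discrepancy sequences stratify the unit interval far more evenly than pseudo-random draws, the QMC estimates typically sit much closer to the analytic mean at this 150-sample budget, mirroring the efficiency ranking reported above.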

