A Robust Solution to Variational Importance Sampling of Minimum Variance

Entropy ◽  
2020 ◽  
Vol 22 (12) ◽  
pp. 1405
Author(s):  
Jerónimo Hernández-González ◽  
Jesús Cerquides

Importance sampling is a Monte Carlo method where samples are obtained from an alternative proposal distribution. This can be used to focus the sampling process on the relevant parts of the space, thus reducing the variance. Selecting the proposal that leads to the minimum variance can be formulated as an optimization problem and solved, for instance, with a variational approach. Variational inference selects, from a given family, the distribution which minimizes the divergence to the distribution of interest. The Rényi projection of order 2 leads to the importance sampling estimator of minimum variance, but its computation is very costly. In this study of discrete distributions that factorize over probabilistic graphical models, we propose and evaluate an approximate projection method onto fully factored distributions. Our evaluation shows that a proposal distribution mixing the information projection with the approximate Rényi projection of order 2 could be interesting from a practical perspective.
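As an illustrative sketch (not code from the paper), the variance-reduction idea can be seen in a small discrete example: for a positive integrand f, the minimum-variance proposal is proportional to p(x)f(x), so a proposal that concentrates mass on the rare, high-value states gives a far more stable estimator. All distributions and values below are made up for illustration.

```python
import random

# Target distribution p and integrand f; the mass of f sits on rare states.
p = {0: 0.90, 1: 0.07, 2: 0.02, 3: 0.01}
f = {0: 0.0, 1: 0.0, 2: 10.0, 3: 100.0}

exact = sum(p[x] * f[x] for x in p)  # 0.02*10 + 0.01*100 = 1.2

def sample(dist, rng):
    # Inverse-CDF sampling from a small discrete distribution.
    u, acc = rng.random(), 0.0
    for x, px in dist.items():
        acc += px
        if u < acc:
            return x
    return x

def is_estimate(q, n, seed=0):
    # Standard importance-sampling estimator with weights p(x)/q(x).
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = sample(q, rng)
        total += p[x] / q[x] * f[x]
    return total / n

# A proposal roughly proportional to p(x)*f(x) shifts samples onto the
# rare states that dominate the expectation, shrinking the variance.
q_good = {0: 0.05, 1: 0.05, 2: 0.15, 3: 0.75}
est = is_estimate(q_good, 20000)
```

With this proposal the per-sample weight p(x)/q(x)·f(x) is nearly constant on the states that matter, which is exactly the minimum-variance condition the Rényi-2 projection targets.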

2011 ◽  
Vol 133 (6) ◽  
Author(s):  
W. Hu ◽  
M. Li ◽  
S. Azarm ◽  
A. Almansoori

Many engineering optimization problems are multi-objective, constrained, and have uncertainty in their inputs. For such problems it is desirable to obtain solutions that are multi-objectively optimum and robust. A robust solution is one whose variations in objective and constraint functions, resulting from input uncertainty, remain within an acceptable range. This paper presents a new approximation-assisted multi-objective robust optimization (AA-MORO) technique with interval uncertainty. The technique is a significant improvement, in terms of computational effort, over previously reported MORO techniques. AA-MORO includes an upper-level problem that solves a multi-objective optimization problem whose feasible domain is iteratively restricted by constraint cuts determined by a lower-level optimization problem. AA-MORO also includes an online approximation wherein optimal solutions from the upper- and lower-level optimization problems are used to iteratively improve an approximation of the objective and constraint functions. Several examples are used to test the proposed technique. The test results show that AA-MORO reasonably approximates solutions obtained from previous MORO approaches while its computational effort, in terms of the number of function calls, is significantly reduced.


2019 ◽  
Vol 2019 ◽  
pp. 1-19
Author(s):  
NingNing Du ◽  
Yan-Kui Liu ◽  
Ying Liu

In financial optimization problems, the optimal portfolios usually depend heavily on the distributions of uncertain return rates. When the distributional information about uncertain return rates is only partially available, it is important for investors to find a robust solution that is immunized against the distribution uncertainty. The main contribution of this paper is to develop an ambiguous value-at-risk (VaR) optimization framework for portfolio selection problems where the distributions of uncertain return rates are partially available. For tractability, we derive new safe approximations of the ambiguous probabilistic constraints under two types of random perturbation sets and obtain two equivalent tractable formulations of the ambiguous probabilistic constraints. Finally, to demonstrate the potential for solving portfolio optimization problems, we provide a practical example based on the Chinese stock market. The advantage of the proposed robust optimization method is also illustrated by comparing it with an existing optimization approach via numerical experiments.


Author(s):  
Ximing Li ◽  
Changchun Li ◽  
Jinjin Chi ◽  
Jihong Ouyang

Overdispersed black-box variational inference employs importance sampling to reduce the variance of the Monte Carlo gradient in black-box variational inference, using a simple overdispersed proposal distribution. This paper investigates how to adaptively obtain a better proposal distribution for lower variance. To this end, we directly approximate the theoretically optimal proposal using a Monte Carlo moment matching step at each variational iteration. We call this adaptive proposal the moment matching proposal (MMP). Experimental results on two Bayesian models show that the MMP can effectively reduce variance in black-box learning and performs better than baseline inference algorithms.
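A toy sketch of the moment-matching idea, with illustrative names and a one-dimensional Gaussian family (not the paper's models or code): at each step, draw from the current proposal, compute self-normalized importance weights against the target, and reset the proposal's parameters to the weighted moments.

```python
import math
import random

def gauss_pdf(x, mu, sigma):
    # Density of N(mu, sigma^2).
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def target_unnorm(x):
    # Unnormalized target; here a Gaussian N(2, 0.5^2) chosen for illustration.
    return gauss_pdf(x, 2.0, 0.5)

def moment_match(mu, sigma, n=5000, seed=0):
    # One Monte Carlo moment-matching step: sample from the current
    # proposal, weight by target/proposal, and return the weighted
    # mean and standard deviation as the new proposal parameters.
    rng = random.Random(seed)
    xs = [rng.gauss(mu, sigma) for _ in range(n)]
    ws = [target_unnorm(x) / gauss_pdf(x, mu, sigma) for x in xs]
    z = sum(ws)
    mean = sum(w * x for w, x in zip(ws, xs)) / z
    var = sum(w * (x - mean) ** 2 for w, x in zip(ws, xs)) / z
    return mean, math.sqrt(var)

# Start from a deliberately poor, overdispersed proposal and adapt.
mu, sigma = 0.0, 2.0
for _ in range(3):
    mu, sigma = moment_match(mu, sigma)
```

After a few iterations the proposal's moments settle near those of the target, so subsequent importance weights are nearly uniform and the gradient estimator's variance drops.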


2004 ◽  
Vol 36 (02) ◽  
pp. 417-433 ◽  
Author(s):  
Maria De Iorio ◽  
Robert C. Griffiths

Stephens and Donnelly (2000) constructed an efficient sequential importance-sampling proposal distribution on coalescent histories of a sample of genes for computing the likelihood of a type configuration of genes in the sample. In the current paper a characterization of their importance-sampling proposal distribution is given in terms of the diffusion-process generator describing the distribution of the population gene frequencies. This characterization leads to a new technique for constructing importance-sampling algorithms in a much more general framework when the distribution of population gene frequencies follows a diffusion process, by approximating the generator of the process.


2012 ◽  
Vol 605-607 ◽  
pp. 2399-2404
Author(s):  
Xin Lai Chen ◽  
Song Shen

We present a new Weapon-Target Assignment (WTA) model for a warship fleet that reflects the characteristics of the modern naval battlefield and battle modalities. The model treats WTA as a multi-objective optimization problem, and a Fast and Elitist Non-Dominated Sorting Genetic Algorithm (FENSGA) is applied to solve it. The FENSGA yields a widely distributed set of robust solutions; a single run produces multiple Pareto-optimal solutions from which the commander can select. A simulation is given to demonstrate the validity of the model and algorithm.
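At the core of NSGA-II-style algorithms such as the FENSGA described above is non-dominated sorting. A minimal sketch of extracting the first Pareto front (assuming both objectives are minimized; the data points are illustrative, not from the paper):

```python
def dominates(a, b):
    # a dominates b if it is no worse in every objective
    # and strictly better in at least one.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_front(points):
    # The first Pareto front: points dominated by no other point.
    return [p for p in points if not any(dominates(q, p) for q in points)]

pts = [(1, 5), (2, 3), (4, 1), (3, 4), (5, 5)]
front = non_dominated_front(pts)
```

A full NSGA-II iteration would repeatedly peel off fronts like this and break ties within a front by crowding distance; this sketch shows only the dominance test that defines the Pareto set the commander selects from.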


2010 ◽  
Vol 102-104 ◽  
pp. 301-305
Author(s):  
Yong Xian Li ◽  
Bin Wang ◽  
Guang Ping Peng

A new intelligent orthogonal optimization algorithm for robust design is proposed in order to improve accuracy and efficiency. After the robust optimization model is first evaluated at the design parameters of an orthogonal array, the next search direction and search range of the variables are determined by the variance ratio. A new orthogonal array for further optimization is then formed intelligently by analysis of the variance ratio. The intelligent orthogonal optimization is repeated until the error of each variable is zero or negligible, at which point the optimal robust solution is reached. Correspondingly, the variable range corresponding to the minimum variance ratio in the orthogonal array of the preceding step gives the tolerance of the optimal robust solution, so no separate tolerance design is needed. This paper takes a cam profile as an example of robust design. The simulation results show that the new intelligent algorithm for robust design has many advantages, such as shorter calculation time, higher speed, and avoidance of both premature convergence to local optima and the slow convergence of global search.

