SO-MI: A surrogate model algorithm for computationally expensive nonlinear mixed-integer black-box global optimization problems

2013 ◽ Vol 40 (5) ◽ pp. 1383-1400 ◽ Author(s): Juliane Müller, Christine A. Shoemaker, Robert Piché
2018 ◽ Vol 51 (2) ◽ pp. 265-285 ◽ Author(s): Abdulbaset Saad, Zuomin Dong, Brad Buckham, Curran Crawford, Adel Younis, ...

2019 ◽ Vol 31 (4) ◽ pp. 689-702 ◽ Author(s): Juliane Müller, Marcus Day

We introduce SHEBO (surrogate optimization of problems with hidden constraints and expensive black-box objectives), an efficient algorithm that employs surrogate models to solve computationally expensive black-box simulation optimization problems with hidden constraints. A hidden constraint is encountered when the objective function evaluation fails to return a value for a parameter vector, as often happens when the objective is computed by a black-box simulation code. SHEBO uses a combination of local and global search strategies, together with an evaluability prediction function and a dynamically adjusted evaluability threshold, to iteratively select new sample points. We compare the performance of our algorithm with that of the mesh-based algorithms mesh adaptive direct search (MADS; NOMAD [nonlinear optimization by mesh adaptive direct search] implementation) and implicit filtering, and with SNOBFIT (stable noisy optimization by branch and fit), which assigns artificial function values to points that violate the hidden constraints. Our numerical experiments on a large set of test problems with 2–30 dimensions and a 31-dimensional real-world application problem arising in combustion simulation show that SHEBO is an efficient solver that outperforms the other methods on many test problems.
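The core loop described above can be illustrated with a deliberately simplified sketch. The RBF surrogate, the nearest-neighbor evaluability estimate, the specific threshold-adjustment factors, and the toy objective `expensive_sim` are all assumptions for illustration, not SHEBO's actual components; the sketch only shows the interplay of a surrogate on evaluable points, an evaluability prediction, and a dynamically adjusted threshold.

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive_sim(x):
    # Hypothetical black box: a hidden constraint makes the simulation
    # return no value (NaN) inside a "crash" region near (0.7, 0.7).
    if np.linalg.norm(x - 0.7) < 0.25:
        return np.nan
    return float(np.sum((x - 0.3) ** 2))   # true optimum at (0.3, 0.3)

def rbf_predict(X, y, x, eps=1.0):
    # Inverse-multiquadric RBF interpolant fitted to evaluable points only.
    def phi(d):
        return 1.0 / np.sqrt(1.0 + (eps * d) ** 2)
    A = phi(np.linalg.norm(X[:, None] - X[None, :], axis=-1))
    w = np.linalg.solve(A + 1e-8 * np.eye(len(X)), y)
    return phi(np.linalg.norm(X - x, axis=-1)) @ w

def evaluability(X_all, ok, x, k=3):
    # Predicted evaluability: fraction of the k nearest sampled points
    # whose simulations returned a value.
    d = np.linalg.norm(X_all - x, axis=-1)
    return ok[np.argsort(d)[:k]].mean()

dim = 2
X = rng.uniform(0, 1, (8, dim))            # initial design
y = np.array([expensive_sim(x) for x in X])
ok = ~np.isnan(y)

tau = 0.5                                  # evaluability threshold, adjusted on the fly
for _ in range(40):
    cands = rng.uniform(0, 1, (200, dim))
    e = np.array([evaluability(X, ok, c) for c in cands])
    feas = cands[e >= tau]
    if len(feas) == 0:                     # threshold too strict: relax it
        tau *= 0.9
        continue
    preds = [rbf_predict(X[ok], y[ok], c) for c in feas]
    x_new = feas[int(np.argmin(preds))]
    f_new = expensive_sim(x_new)
    X = np.vstack([X, x_new])
    y = np.append(y, f_new)
    ok = np.append(ok, not np.isnan(f_new))
    if np.isnan(f_new):                    # tighten after hitting a hidden constraint
        tau = min(0.9, tau * 1.02)

best = np.nanmin(y)
```

Candidate points predicted to be non-evaluable are filtered out before the surrogate ranks them, so failed simulations steer the search away from the hidden-constraint region without any artificial penalty values.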


Author(s):  
Laurens Bliek ◽  
Sicco Verwer ◽  
Mathijs de Weerdt

When a black-box optimization objective can only be evaluated with costly or noisy measurements, most standard optimization algorithms are unsuited to finding the optimal solution. Specialized algorithms for exactly this situation make use of surrogate models. These models are usually continuous and smooth, which is beneficial for continuous optimization problems but not necessarily for combinatorial ones. However, we show that by choosing the basis functions of the surrogate model in a certain way, the optimal solution of the surrogate model is guaranteed to be integer. This approach outperforms random search, simulated annealing, and a Bayesian optimization algorithm on the problem of finding robust routes for a noise-perturbed traveling salesman benchmark problem, performs similarly to another Bayesian optimization algorithm, and outperforms all compared algorithms on a convex binary optimization problem with a large number of variables.


2014 ◽ Vol 136 (8) ◽ Author(s): Stefanos Koullias, Dimitri N. Mavris

The design of unconventional systems requires early use of high-fidelity physics-based tools to search the design space for improved and potentially optimum designs. Current methods for incorporating these computationally expensive tools into early design to reduce uncertainty are inadequate, given the limited computational resources available at that stage. Furthermore, the lack of finite-difference derivatives, unknown design space properties, and the possibility of code failures motivate the need for a robust and efficient global optimization (EGO) algorithm. We present a novel surrogate model-based global optimization algorithm capable of efficiently searching challenging design spaces for improved designs. The algorithm, called fBcEGO (fully Bayesian constrained EGO), constructs a fully Bayesian Gaussian process (GP) model from a set of observations and then uses the model to make new observations in promising areas where improvements are likely to occur. This model remedies the inadequacies of likelihood-based approaches, which may provide an incomplete inference of the underlying function when function evaluations are expensive and therefore scarce. A challenge in constructing the fully Bayesian GP model is the selection of the prior distribution placed on the model hyperparameters. Previous work employs static priors, which may not capture a sufficient number of interpretations of the data to make useful inferences about the underlying function. We present an iterative method that dynamically assigns hyperparameter priors by exploiting the mechanics of Bayesian penalization. fBcEGO is incorporated into a methodology that generates relatively few infeasible designs and provides large reductions in objective function values. In testing, the new algorithm solved more nonlinearly constrained algebraic test problems to higher accuracy, relative to the global minimum, than other popular surrogate model-based global optimization algorithms, and it obtained the largest reduction in the takeoff gross weight objective for a notional 70-passenger regional jet case study when compared with competing design methods.
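The EGO pattern underlying this abstract, fit a GP to the observations, then evaluate the point maximizing expected improvement, can be sketched compactly. Note the simplification: the sketch fixes the GP hyperparameters (`ell`, `s2`), whereas fBcEGO's point is to treat them fully Bayesianly; the 1-D objective `f` and the grid-based acquisition maximization are also illustrative assumptions.

```python
import numpy as np
from math import erf, sqrt, pi

def f(x):
    # Hypothetical expensive 1-D objective on [0, 2] (minimum near x ≈ 1.51).
    return np.sin(3 * x) + 0.5 * x

def gp_posterior(X, y, xq, ell=0.3, s2=1.0, noise=1e-6):
    # GP posterior with a squared-exponential kernel and *fixed* hyperparameters;
    # a fully Bayesian treatment would marginalize over ell and s2 instead.
    def k(a, b):
        return s2 * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)
    K = k(X, X) + noise * np.eye(len(X))
    Ks = k(xq, X)
    mu = Ks @ np.linalg.solve(K, y)
    v = np.linalg.solve(K, Ks.T)
    var = np.maximum(s2 - np.sum(Ks * v.T, axis=1), 1e-12)
    return mu, np.sqrt(var)

def expected_improvement(mu, sd, f_best):
    # EI for minimization: (f_best - mu) * Phi(z) + sd * phi(z).
    z = (f_best - mu) / sd
    Phi = 0.5 * (1 + np.vectorize(erf)(z / sqrt(2)))
    phi = np.exp(-0.5 * z ** 2) / sqrt(2 * pi)
    return (f_best - mu) * Phi + sd * phi

X = np.array([0.0, 1.0, 2.0])      # initial design
y = f(X)
grid = np.linspace(0, 2, 401)
for _ in range(10):                # EGO loop: evaluate the EI maximizer
    mu, sd = gp_posterior(X, y, grid)
    x_new = grid[np.argmax(expected_improvement(mu, sd, y.min()))]
    X = np.append(X, x_new)
    y = np.append(y, f(x_new))
```

EI balances exploitation (low posterior mean) against exploration (high posterior variance), which is why the quality of the GP's uncertainty estimates, and hence the hyperparameter treatment the abstract focuses on, matters so much.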


Author(s): Liqun Wang, Songqing Shan, G. Gary Wang

The presence of black-box functions in engineering design, which are usually computation-intensive, demands efficient global optimization methods. This work proposes a new global optimization method for black-box functions. The method is based on a novel mode-pursuing sampling (MPS) scheme that systematically generates more sample points in the neighborhood of the function mode while statistically covering the entire search space. Quadratic regression is performed to detect the region containing the global optimum. The sampling and detection process iterates until the global optimum is obtained. Through intensive testing, the method is found to be effective, efficient, robust, and applicable to both continuous and discontinuous functions. It supports simultaneous computation and applies to both unconstrained and constrained optimization problems. Because it does not call any existing global optimization tool, it can also be used as a standalone global optimization method for inexpensive problems. Limitations of the method are also identified and discussed.
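The two ingredients named above, sampling biased toward the current mode while keeping positive density everywhere, and quadratic regression to detect the optimum region, can be sketched in 1-D. The nearest-neighbor surrogate, the rejection-sampling acceptance rule, and the toy objective are simplifying assumptions standing in for the spline-based construction of the actual MPS method.

```python
import numpy as np

rng = np.random.default_rng(3)

def f(x):
    # Cheap stand-in for an expensive black box (minimum at x = 0.6).
    return (x - 0.6) ** 2

X = rng.uniform(0, 1, 6)           # initial evaluations over [0, 1]
y = f(X)

for _ in range(20):
    def f_hat(x):
        # Nearest-neighbor surrogate: a cheap stand-in for MPS's spline fit.
        return y[np.argmin(np.abs(X - x))]
    # Rejection sampling with acceptance probability proportional to
    # g(x) = f_max - f_hat(x): density peaks at the current mode (best point)
    # yet stays positive everywhere, so the whole space keeps being covered.
    g_max = y.max() - y.min() + 1e-12
    while True:
        cand = rng.uniform(0, 1)
        if rng.uniform() < (y.max() - f_hat(cand) + 1e-12) / g_max:
            break
    X = np.append(X, cand)
    y = np.append(y, f(cand))

# Quadratic regression on the best points to detect the region
# containing the global optimum.
idx = np.argsort(y)[:6]
A = np.vstack([X[idx] ** 2, X[idx], np.ones(6)]).T
a, b, c = np.linalg.lstsq(A, y[idx], rcond=None)[0]
x_mode = -b / (2 * a)              # vertex of the fitted quadratic
```

Because accepted samples concentrate near the incumbent best point, the quadratic fit over the best evaluations localizes the optimum region, which in the full method triggers the next round of sampling and detection.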

