Statistical Surrogate Formulations for Simulation-Based Design Optimization

2015 ◽  
Vol 137 (2) ◽  
Author(s):  
Bastien Talgorn ◽  
Sébastien Le Digabel ◽  
Michael Kokkolaras

Typical challenges of simulation-based design optimization include unavailable gradients and unreliable approximations thereof, expensive function evaluations, numerical noise, multiple local optima, and the failure of the analysis to return a value to the optimizer. One possible remedy is to use surrogate models in lieu of the computational models or simulations, together with derivative-free optimization algorithms. In this work, we use the R dynaTree package to build statistical surrogates of the blackboxes and the direct search method for derivative-free optimization. We present different formulations for the surrogate problem (SP) considered at each search step of the mesh adaptive direct search (MADS) algorithm using a surrogate management framework. The proposed formulations are tested on 20 analytical benchmark problems and two simulation-based multidisciplinary design optimization (MDO) problems. Numerical results confirm that the use of statistical surrogates in MADS improves the efficiency of the optimization algorithm.
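A minimal sketch of the surrogate management idea described above, not the paper's dynaTree/MADS implementation: cached blackbox evaluations feed a cheap surrogate, the search step minimizes the surrogate over a candidate set, and only the surrogate minimizer is evaluated with the true blackbox. The quadratic blackbox, the inverse-distance surrogate, and all parameters are hypothetical stand-ins.

```python
def blackbox(x):
    # Hypothetical expensive simulation stand-in: smooth 1-D objective
    return (x - 0.7) ** 2

def surrogate(cache, x):
    # Inverse-distance-weighted interpolant over cached evaluations,
    # a cheap stand-in for the statistical dynaTree surrogates
    num = den = 0.0
    for xi, fi in cache:
        d = abs(x - xi) + 1e-12
        num += fi / d
        den += 1.0 / d
    return num / den

def search_step(cache, lo, hi, n_candidates=50):
    # Surrogate problem (SP): minimize the surrogate over a candidate
    # set, then spend a single true evaluation on its minimizer
    cands = [lo + (hi - lo) * i / (n_candidates - 1) for i in range(n_candidates)]
    cands = [x for x in cands if all(abs(x - xi) > 1e-9 for xi, _ in cache)]
    x_best = min(cands, key=lambda x: surrogate(cache, x))
    return x_best, blackbox(x_best)

# Seed the cache, then alternate surrogate searches with true evaluations
cache = [(x, blackbox(x)) for x in (0.0, 0.25, 0.5, 1.0)]
for _ in range(5):
    cache.append(search_step(cache, 0.0, 1.0))
x_best, f_best = min(cache, key=lambda p: p[1])
```

Each search step here costs one true evaluation regardless of how many candidates the surrogate screens, which is the economy the surrogate management framework exploits.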


2019 ◽  
Vol 142 (2) ◽  
Author(s):  
Ahmed H. Bayoumy ◽  
Michael Kokkolaras

Abstract We consider the problem of selecting among different physics-based computational models of varying, and oftentimes not assessed, fidelity for evaluating the objective and constraint functions in numerical design optimization. Typically, higher-fidelity models are associated with higher computational cost. Therefore, it is desirable to employ them only when necessary. We introduce a relative adequacy framework that aims at determining whether lower-fidelity models (that are typically associated with lower computational cost) can be used in certain areas of the design space as the latter is being explored during the optimization process. We implement our approach by means of a trust-region management framework that utilizes the mesh adaptive direct search derivative-free optimization algorithm. We demonstrate the link between feasibility and fidelity and the key features of the proposed approach using two design optimization examples: a cantilever flexible beam subject to high accelerations and an airfoil in transonic flow conditions.
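The relative-adequacy idea above can be illustrated with a deliberately simplified sketch (the paper manages this within a trust-region framework; the two analytic models, probe points, and tolerance below are hypothetical stand-ins): assess where the cheap model's discrepancy from the expensive one is small, and reuse the cheap model there.

```python
def f_hi(x):
    # Hypothetical high-fidelity model: expensive but accurate
    return (x - 1.0) ** 2 + 0.05 * x ** 3

def f_lo(x):
    # Hypothetical low-fidelity model: cheap, misses the cubic term
    return (x - 1.0) ** 2

def adequacy_map(probes, tol=0.1):
    # Assess the lo-fi model's relative adequacy at a few probe points;
    # an optimizer can then reuse the cheap model in regions where the
    # observed discrepancy is small
    return {x: abs(f_hi(x) - f_lo(x)) <= tol for x in probes}

verdicts = adequacy_map([-2.0, 0.0, 2.0])
```

Here the low-fidelity model is judged adequate only near the origin, where the neglected cubic term is small, so an optimizer exploring that region could avoid most high-fidelity calls.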


Author(s):  
Marcus Pettersson ◽  
Johan Ölvander

Box’s Complex method for direct search has shown promise when applied to simulation-based optimization. In direct search methods such as Box’s Complex method, the search starts with a set of points, each of which is a solution to the optimization problem. In the Complex method, the number of points must be at least one plus the number of variables. However, in order to avoid premature termination and to increase the likelihood of finding the global optimum, more points are often used, at the expense of an increased number of required evaluations. The idea in this paper is to gradually remove points during the optimization in order to obtain an adaptive Complex method for more efficient design optimization. The proposed method shows encouraging results when compared to the Complex method with a fixed number of points and to a quasi-Newton method.
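A sketch of the adaptive idea, under assumed details (the removal schedule, reflection factor, and 1-D test objective are illustrative guesses, not the paper's settings): a Complex-style search that reflects the worst point through the centroid of the others and periodically discards the worst point until a minimum population remains.

```python
import random

def complex_method(f, bounds, n_start=8, n_min=3, iters=60, seed=0):
    # Box's Complex method with gradual point removal (sketch of the
    # paper's idea in one dimension)
    rng = random.Random(seed)
    lo, hi = bounds
    pts = [lo + (hi - lo) * rng.random() for _ in range(n_start)]
    for k in range(iters):
        pts.sort(key=f)                       # best first, worst last
        if len(pts) > n_min and k % 10 == 9:
            pts.pop()                         # gradually shrink the Complex
        worst = pts[-1]
        centroid = sum(pts[:-1]) / (len(pts) - 1)
        trial = centroid + 1.3 * (centroid - worst)   # over-reflection
        trial = min(max(trial, lo), hi)               # clip to bounds
        if f(trial) < f(worst):
            pts[-1] = trial                   # accept reflected point
        else:
            pts[-1] = 0.5 * (worst + centroid)        # contract instead
    return min(pts, key=f)

x_best = complex_method(lambda x: (x - 0.3) ** 2, (0.0, 1.0))
```

Starting with eight points buys robustness early on, while the periodic removal means late iterations, when the population has clustered, no longer pay for redundant evaluations.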


2021 ◽  
Author(s):  
Faruk Alpak ◽  
Yixuan Wang ◽  
Guohua Gao ◽  
Vivek Jain

Abstract Recently, a novel distributed quasi-Newton (DQN) derivative-free optimization (DFO) method was developed for generic reservoir performance optimization problems, including well-location optimization (WLO) and well-control optimization (WCO). DQN is designed to effectively locate multiple local optima of highly nonlinear optimization problems. However, its performance has neither been validated on realistic applications nor compared to other DFO methods. We have integrated DQN into a versatile field-development optimization platform designed specifically for iterative workflows enabled through distributed-parallel flow simulations. DQN is benchmarked against alternative DFO techniques, namely, the Broyden–Fletcher–Goldfarb–Shanno (BFGS) method hybridized with Direct Pattern Search (BFGS-DPS), Mesh Adaptive Direct Search (MADS), Particle Swarm Optimization (PSO), and a Genetic Algorithm (GA).

DQN is a multi-thread optimization method that distributes an ensemble of optimization tasks among multiple high-performance-computing nodes. Thus, it can locate multiple optima of the objective function in parallel within a single run. Simulation results computed by one DQN optimization thread are shared with the others by updating a unified set of training data points composed of the responses (implicit variables) of all successful simulation jobs. The sensitivity matrix at the current best solution of each optimization thread is approximated by a linear-interpolation technique using all or a subset of the training data points. The gradient of the objective function is computed analytically using the estimated sensitivities of the implicit variables with respect to the explicit variables. The Hessian matrix is then updated using the quasi-Newton method, and a new search point for each thread is obtained by solving a trust-region subproblem for the next iteration. In contrast, other DFO methods rely on a single-thread optimization paradigm that can only locate a single optimum. To locate multiple optima with such methods, one must repeat the same optimization process multiple times from different initial guesses. Moreover, simulation results generated by a single-thread optimization task cannot be shared with other tasks.

Benchmarking results are presented for synthetic yet challenging WLO and WCO problems. Finally, the DQN method is field-tested on two realistic applications. DQN identifies the global optimum with the fewest simulations and the shortest run time on a synthetic problem with a known solution. On the other benchmarking problems, without known solutions, DQN identified comparable local optima with considerably fewer simulations than the alternative techniques. The field-testing results reinforce the favorable computational attributes of DQN. Overall, the results indicate that DQN is a novel and effective parallel algorithm for field-scale development optimization problems.
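A toy 1-D sketch of the thread-sharing idea in the abstract above, not the DQN implementation: two optimization "threads" start from different guesses but append every evaluation to one shared pool of training points, and each thread estimates its local slope from the nearest shared points. The multimodal objective, the secant-based sensitivity estimate, and the crude step clipping standing in for the trust-region subproblem are all assumptions for illustration.

```python
def blackbox(x):
    # Hypothetical multimodal objective standing in for a reservoir
    # simulation response; minima at x = -1 and x = +1
    return (x ** 2 - 1.0) ** 2

def local_slope(cache, x):
    # Estimate df/dx by a secant through the two cached points nearest
    # to x (a 1-D stand-in for DQN's linear-interpolation sensitivities)
    (x0, f0), (x1, f1) = sorted(cache, key=lambda p: abs(p[0] - x))[:2]
    return (f1 - f0) / (x1 - x0) if x1 != x0 else 0.0

# Two optimization "threads" start from different guesses but share one
# pool of training points, so each reuses the other's simulations
threads = [-1.6, 1.4]
cache = [(x, blackbox(x)) for x in threads]
for _ in range(30):
    for i, x in enumerate(threads):
        g = local_slope(cache, x)
        step = max(-0.2, min(0.2, -0.1 * g))  # crude cap standing in for a trust region
        threads[i] = x + step
        cache.append((threads[i], blackbox(threads[i])))
```

Because the two threads descend into different basins while drawing on the same training data, a single run ends near both local optima, which is the behavior the abstract contrasts with single-thread DFO methods.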


2019 ◽  
Vol 145 (6) ◽  
pp. 3795-3804 ◽  
Author(s):  
Robin Tournemenne ◽  
Jean-François Petiot ◽  
Bastien Talgorn ◽  
Joël Gilbert ◽  
Michael Kokkolaras

2021 ◽  
Author(s):  
Muhammad Jalil Ahmad ◽  
Korhan Günel

This study presents an alternative numerical approach for solving second-order differential equations with Dirichlet boundary conditions. The Mesh Adaptive Direct Search (MADS) algorithm is adopted to train the feed-forward neural network used in this approach. Because MADS is a derivative-free optimization algorithm, it reduces the time-consuming workload of the training stage. The results obtained with this approach are also compared with those of the Generalized Pattern Search (GPS) algorithm.
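A minimal sketch of this kind of approach under stated assumptions: the "network" is reduced to a two-parameter linear model inside a trial solution that satisfies the Dirichlet boundary conditions by construction, the loss is the squared residual of the hypothetical test problem y'' = -2 with y(0) = y(1) = 0 at collocation points, and training uses a simple coordinate pattern search (a GPS-style stand-in for MADS, no gradients required).

```python
def trial(x, w):
    # Trial solution built to satisfy y(0) = y(1) = 0 exactly; the
    # factor (w[0] + w[1]*x) is a tiny stand-in for the neural network
    return x * (1 - x) * (w[0] + w[1] * x)

def residual_loss(w, n=11, h=1e-3):
    # Sum of squared residuals of y'' = -2 at interior collocation
    # points, with y'' approximated by central finite differences
    loss = 0.0
    for i in range(1, n - 1):
        x = i / (n - 1)
        ypp = (trial(x + h, w) - 2 * trial(x, w) + trial(x - h, w)) / h**2
        loss += (ypp + 2.0) ** 2
    return loss

def pattern_search(loss, w, step=0.5, tol=1e-6):
    # Coordinate pattern search: poll +/- step along each coordinate,
    # keep improvements, and halve the step when a full poll fails
    best = loss(w)
    while step > tol:
        improved = False
        for i in range(len(w)):
            for d in (step, -step):
                cand = list(w)
                cand[i] += d
                f = loss(cand)
                if f < best:
                    w, best, improved = cand, f, True
        if not improved:
            step *= 0.5
    return w, best

w, final = pattern_search(residual_loss, [0.0, 0.0])
```

For this test problem the exact solution is y = x(1 - x), so the search should drive the parameters toward w = [1, 0] and the residual loss toward zero without ever evaluating a derivative of the loss.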

