Global Optimization Under Uncertainty and Uncertainty Quantification Applied to Tractor-Trailer Base Flaps

Author(s):  
Jacob A. Freeman ◽  
Christopher J. Roy

Using a global optimization evolutionary algorithm (EA), propagating aleatory and epistemic uncertainty within the optimization loop, and using computational fluid dynamics (CFD), this study determines a design for a 3D tractor-trailer base (back-end) drag reduction device that reduces the wind-averaged drag coefficient by 41% at 57 mph (92 km/h). Because it is optimized under uncertainty, this design is relatively insensitive to uncertain wind speed and direction and uncertain deflection angles due to mounting accuracy and static aeroelastic loading. The model includes five design variables with generous constraints, and this study additionally includes the uncertain effects on drag prediction due to truck speed and elevation, steady Reynolds-averaged Navier–Stokes (RANS) approximation, and numerical approximation. This study uses the Design Analysis Kit for Optimization and Terascale Applications (DAKOTA) optimization and uncertainty quantification (UQ) framework to interface the RANS flow solver, grid generator, and optimization algorithm. The computational model is a simplified full-scale tractor-trailer with flow at highway speed. For the optimized design, the estimate of total predictive uncertainty is +15/−42%; 8–10% of this uncertainty comes from model form (computation versus experiment); 3–7% from model input (wind speed and direction, flap angle, and truck speed); and +0.0/−28.5% from numerical approximation (due to the relatively coarse 6 × 10⁶-cell grid). Relative comparison of designs to the no-flaps baseline should have considerably less uncertainty because numerical error and input variation are nearly eliminated and model form differences are reduced. The total predictive uncertainty is also presented in the form of a probability box, which may be used to decide how to improve the model and reduce uncertainty.
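The wind-averaged drag coefficient central to this abstract can be illustrated with a minimal sketch of the usual yaw-weighted average over wind directions. This is an assumed schematic form: the mean wind speed, the toy Cd(yaw) curve, and the uniform wind-direction distribution are illustrative stand-ins, not the study's actual model.

```python
import numpy as np

def wind_averaged_cd(cd_of_yaw, v_truck=25.5, v_wind=3.1, n_dir=360):
    """Yaw-weighted wind-averaged drag coefficient (schematic form).

    cd_of_yaw : callable, Cd as a function of yaw angle in radians
    v_truck   : truck speed in m/s (57 mph is roughly 25.5 m/s)
    v_wind    : assumed mean wind speed in m/s (illustrative value)
    """
    theta = np.linspace(0.0, 2.0 * np.pi, n_dir, endpoint=False)
    # Components of the relative wind seen by the truck for each direction.
    u = v_truck + v_wind * np.cos(theta)
    w = v_wind * np.sin(theta)
    psi = np.arctan2(w, u)            # yaw angle of the relative wind
    q_rel = u**2 + w**2               # proportional to dynamic pressure
    # Weight Cd(psi) by relative dynamic pressure, normalized to still air.
    return np.mean(cd_of_yaw(psi) * q_rel) / v_truck**2

# Toy yaw sensitivity: drag grows quadratically with yaw (illustrative only).
cd_wavg = wind_averaged_cd(lambda psi: 0.6 + 2.0 * psi**2)
```

Because the average weights each direction by relative dynamic pressure, the wind-averaged value sits slightly above the zero-yaw Cd even for modest crosswinds, which is why the metric is preferred over a single zero-yaw number.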

Author(s):  
Bernhard Poethke ◽  
Stefan Völker ◽  
Konrad Vogeler

In the surrogate model-based optimization of turbine airfoils, often only the prediction values for objective and constraints are employed, without considering uncertainties in the prediction. This is also the case for multi-fidelity optimization strategies based on, e.g., the Gappy-POD approach, in which results from analyses of different fidelities are incorporated. However, considering uncertainties in global optimization has the advantage of producing a balanced coverage of the design space between unexplored regions and regions close to the current optimum. This means that on the one hand regions are covered in which so far only a few sample points are present and thus a high degree of uncertainty exists (global exploration), and on the other hand regions with promising objective and constraint values are investigated (local exploitation). The genuinely new contribution of this work is the quantification of the uncertainty of the multi-fidelity Gappy-POD method and an adapted optimization strategy based on it. The uncertainty quantification is based on the error of the linear fit of low-fidelity values to the POD basis and subsequent forward propagation to the high-fidelity values. The uncertainty quantification is validated for random airfoil designs in a design of experiments. Based on this, a global optimization strategy for constrained problems is presented, built on the well-known Efficient Global Optimization (EGO) strategy and the Feasible Expected Improvement criterion. This means that Kriging models are created for both the objective and the constraint values as functions of the design variables, incorporating both the predictions and their uncertainties. This approach offers the advantage that existing and widely used programs or libraries that support the (single-fidelity) EGO algorithm can be used for multi-fidelity optimization. Finally, the method is demonstrated for an industrial test case.
A comparison between a single-fidelity optimization and a multi-fidelity optimization is made, each with the EGO strategy. A coupling of 2D/3D simulations is used for the multi-fidelity analyses. The proposed method finds feasible members earlier in the optimization, resulting in a faster turn-around than the single-fidelity strategy.
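The Feasible Expected Improvement criterion the strategy builds on can be sketched as classic Expected Improvement weighted by the Kriging-predicted probability of constraint feasibility. This is a minimal sketch: the scalar inputs stand in for a Kriging model's mean/std predictions at a candidate design, and the single-constraint `c(x) <= 0` form is an assumption.

```python
from math import erf, exp, pi, sqrt

def expected_improvement(mu, sigma, f_best):
    """Classic EI for minimization, from a surrogate's mean and std at a point."""
    if sigma <= 0.0:
        return max(f_best - mu, 0.0)
    z = (f_best - mu) / sigma
    pdf = exp(-0.5 * z * z) / sqrt(2.0 * pi)   # standard normal pdf
    cdf = 0.5 * (1.0 + erf(z / sqrt(2.0)))     # standard normal cdf
    return (f_best - mu) * cdf + sigma * pdf

def feasible_expected_improvement(mu, sigma, f_best, mu_c, sigma_c):
    """EI weighted by the probability that the constraint c(x) <= 0 is met."""
    if sigma_c <= 0.0:
        p_feasible = 1.0 if mu_c <= 0.0 else 0.0
    else:
        p_feasible = 0.5 * (1.0 + erf(-mu_c / (sigma_c * sqrt(2.0))))
    return expected_improvement(mu, sigma, f_best) * p_feasible
```

The product form is what lets Gappy-POD uncertainties plug into an off-the-shelf EGO loop: any extra predictive variance from the multi-fidelity model simply inflates `sigma` and `sigma_c`.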


2016 ◽  
Vol 138 (11) ◽  
Author(s):  
Piyush Pandita ◽  
Ilias Bilionis ◽  
Jitesh Panchal

Design optimization under uncertainty is notoriously difficult when the objective function is expensive to evaluate. State-of-the-art techniques, e.g., stochastic optimization or sampling average approximation, fail to learn exploitable patterns from collected data and require many objective function evaluations. There is a need for techniques that alleviate the high cost of information acquisition and select sequential simulations optimally. In the field of deterministic single-objective unconstrained global optimization, the Bayesian global optimization (BGO) approach has been relatively successful in addressing the information acquisition problem. BGO builds a probabilistic surrogate of the expensive objective function and uses it to define an information acquisition function (IAF) that quantifies the merit of making new objective evaluations. In this work, we reformulate the expected improvement (EI) IAF to filter out parametric and measurement uncertainties. We bypass the curse of dimensionality, since the method does not require learning the response surface as a function of the stochastic parameters, and we employ a fully Bayesian interpretation of Gaussian processes (GPs) by constructing a particle approximation of the posterior of its hyperparameters using adaptive Markov chain Monte Carlo (MCMC) to increase the method's robustness. Also, our approach quantifies the epistemic uncertainty on the location of the optimum and the optimal value as induced by the limited number of objective evaluations used in obtaining it. We verify and validate our approach by solving two synthetic optimization problems under uncertainty and demonstrate it by solving the oil-well placement problem (OWPP) with uncertainties in the permeability field and the oil price time series.
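A single BGO iteration of the kind described, fitting a GP surrogate to noisy evaluations and then maximizing EI against the surrogate's best predicted mean rather than the noisy best observation, might look like this minimal 1-D sketch. The RBF kernel, toy objective, and noise level are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from math import erf

def gp_posterior(X, y, Xs, length=0.3, noise=1e-4):
    """Posterior mean/std of a zero-mean GP with a unit-variance RBF kernel."""
    def k(a, b):
        return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / length**2)
    L = np.linalg.cholesky(k(X, X) + noise * np.eye(len(X)))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    Ks = k(X, Xs)
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.clip(1.0 - np.sum(v**2, axis=0), 1e-12, None)
    return mu, np.sqrt(var)

def expected_improvement(mu, sigma, f_best):
    """EI for minimization, vectorized over candidate points."""
    z = (f_best - mu) / sigma
    cdf = 0.5 * (1.0 + np.vectorize(erf)(z / np.sqrt(2.0)))
    pdf = np.exp(-0.5 * z**2) / np.sqrt(2.0 * np.pi)
    return (f_best - mu) * cdf + sigma * pdf

# One iteration on a toy objective whose evaluations carry measurement noise.
rng = np.random.default_rng(0)
f = lambda x: np.sin(6.0 * x) + 0.05 * rng.standard_normal(np.shape(x))
X = np.array([0.1, 0.4, 0.7, 0.95])
Xs = np.linspace(0.0, 1.0, 201)
mu, sigma = gp_posterior(X, f(X), Xs)
# Filter the noise: improve on the surrogate's best *mean*, not the noisy best y.
x_next = Xs[np.argmax(expected_improvement(mu, sigma, mu.min()))]
```

Using `mu.min()` in place of `y.min()` is the simplest version of the noise-filtering idea; the paper's reformulation additionally averages out the stochastic parameters and samples the GP hyperparameters by MCMC instead of fixing them.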


2017 ◽  
Vol 73 ◽  
pp. 137-161 ◽  
Author(s):  
C.T. Nitschke ◽  
P. Cinnella ◽  
D. Lucor ◽  
J.-C. Chassaing

2021 ◽  
pp. 1-35
Author(s):  
Rishabh Singh ◽  
Jose C. Principe

This letter introduces a new framework for quantifying predictive uncertainty for both data and models; it relies on projecting the data into a Gaussian reproducing kernel Hilbert space (RKHS) and transforming the data probability density function (PDF) so that the flow of its gradient becomes a topological potential field defined at all points in the sample space. This enables the decomposition of the PDF gradient flow by formulating it as a moment decomposition problem using operators from quantum physics, specifically Schrödinger's formulation. We experimentally show that the higher-order modes systematically cluster the different tail regions of the PDF, thereby providing unprecedented discriminative resolution of data regions having high epistemic uncertainty. In essence, this approach decomposes local realizations of the data PDF in terms of uncertainty moments. We apply this framework as a surrogate tool for predictive uncertainty quantification of point-prediction neural network models, overcoming various limitations of conventional Bayesian uncertainty quantification methods. Experimental comparisons with established methods illustrate the performance advantages of the framework.
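The quantity the framework starts from, a Gaussian-kernel estimate of the data PDF and its gradient field, can be sketched in one dimension as follows. This shows only that first ingredient; the Schrödinger-style moment decomposition of the gradient flow, which is the letter's actual contribution, is not reproduced here, and the bandwidth and sample sizes are illustrative.

```python
import numpy as np

def parzen_pdf_and_gradient(data, query, h=0.5):
    """Gaussian-kernel (Parzen/RKHS) density estimate and its gradient in 1-D."""
    d = query[:, None] - data[None, :]        # pairwise differences
    k = np.exp(-0.5 * (d / h)**2)             # Gaussian kernel evaluations
    norm = h * np.sqrt(2.0 * np.pi)
    pdf = k.mean(axis=1) / norm               # kernel density estimate
    grad = (-d / h**2 * k).mean(axis=1) / norm  # its derivative w.r.t. query
    return pdf, grad

rng = np.random.default_rng(1)
samples = rng.standard_normal(500)
pdf, grad = parzen_pdf_and_gradient(samples, np.array([-3.0, 0.0, 3.0]))
# Tail points have low density and gradients pointing back toward the data
# mass; these are the regions the higher-order modes resolve further.
```

Intuitively, points where the density is low and the gradient field is strong are exactly the high-epistemic-uncertainty regions the letter's moment decomposition separates into distinct modes.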




2009 ◽  
Vol 626-627 ◽  
pp. 693-698
Author(s):  
Yong Yong Zhu ◽  
S.Y. Gao

The dynamic balance of a spatial engine is investigated. Treating the special wobble-plate engine as a model of spatial RRSSC linkages, design variables for the engine structure are selected based on its configuration characteristics and a kinetic analysis of the wobble-plate engine. To control vibration of the engine frame and reduce the noise caused by the engine, the objective function is chosen as a dimensionless combination of the various shaking forces and moments, with a constraint that limits the shaking moment to a set percentage. The optimization design is then investigated through the mathematical model for dynamic balance. Applying the method to a type of wobble-plate engine, the optimization process is demonstrated as an example; the results show that the optimized design helps control engine vibration and noise and improves performance both practically and theoretically.
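The optimization problem described, minimizing a dimensionless combination of shaking forces and moments subject to a limit on the shaking moment, can be sketched as follows. The quadratic residual models, the weights, and the 30% moment limit are invented stand-ins; the paper's actual balance equations for the RRSSC linkage are not reproduced.

```python
import numpy as np

# Hypothetical stand-ins for the residual (dimensionless) shaking force and
# moment of the linkage as functions of two counterweight design variables.
def shaking_force(x):
    return (x[0] - 0.6)**2 + 0.5 * (x[1] - 0.2)**2 + 0.05

def shaking_moment(x):
    return 0.8 * (x[0] - 0.3)**2 + (x[1] - 0.7)**2 + 0.04

def objective(x, w=0.5):
    # Dimensionless weighted combination of shaking force and shaking moment.
    return w * shaking_force(x) + (1.0 - w) * shaking_moment(x)

# Crude constrained search: keep only designs whose shaking moment stays
# below 30% of the unbalanced reference moment (normalized here to 1).
rng = np.random.default_rng(2)
candidates = rng.uniform(0.0, 1.0, size=(20000, 2))
feasible = candidates[shaking_moment(candidates.T) <= 0.30]
best = feasible[np.argmin(objective(feasible.T))]
```

The weight `w` plays the role of the paper's dimensionless combination: shifting it trades residual frame force against residual moment, which is the vibration/noise trade the abstract describes.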


Author(s):  
Alessandra Cuneo ◽  
Alberto Traverso ◽  
Shahrokh Shahpar

In engineering design, uncertainty is inevitable and can cause significant deviations in the performance of a system. Uncertainty in input parameters can be categorized into two groups: aleatory and epistemic. The work presented here focuses on aleatory uncertainty, which causes natural, unpredictable, and uncontrollable variations in the performance of the system under study. Such uncertainty can be quantified using statistical methods, but the main obstacle is often the computational cost, because the representative model is typically highly non-linear and complex. Therefore, it is necessary to have a robust tool that can perform the uncertainty propagation with as few evaluations as possible. In the last few years, different methodologies for uncertainty propagation and quantification have been proposed. The focus of this study is to evaluate four different methods and demonstrate the strengths and weaknesses of each approach. The first method considered is Monte Carlo simulation, a sampling method that can give high accuracy but needs relatively large computational effort. The second is Polynomial Chaos, an approximation method in which the probabilistic parameters of the response function are modelled with orthogonal polynomials. The third is the Mid-Range Approximation Method, an approach based on assembling multiple meta-models into one model to perform optimization under uncertainty. The fourth is the application of the first two methods not directly to the model but to a response surface representing the model, to decrease computational cost. All these methods were applied to a set of analytical test functions and engineering test cases. Relevant aspects of engineering design and analysis, such as a high number of stochastic variables and optimized design problems with and without stochastic design parameters, were assessed.
Polynomial Chaos emerged as the most promising methodology and was then applied to a turbomachinery test case based on a thermal analysis of a high-pressure turbine disk.
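The trade-off between the first two methods can be seen on a one-dimensional toy problem: for a Gaussian input, Gauss-Hermite quadrature (the collocation backbone of non-intrusive Polynomial Chaos) matches the exact mean with a handful of model evaluations, where Monte Carlo needs thousands. The test function and sample sizes are illustrative, not from the study.

```python
import numpy as np

# Propagate X ~ N(0, 1) through f(x) = exp(x); the exact mean is exp(0.5).
f = np.exp
exact_mean = np.exp(0.5)

# Monte Carlo: robust but converges slowly (error ~ 1/sqrt(N)).
rng = np.random.default_rng(3)
mc_mean = f(rng.standard_normal(100_000)).mean()

# Gauss-Hermite quadrature: E[f(X)] = (1/sqrt(pi)) * sum_i w_i * f(sqrt(2) x_i).
nodes, weights = np.polynomial.hermite.hermgauss(8)   # only 8 evaluations
gh_mean = (weights * f(np.sqrt(2.0) * nodes)).sum() / np.sqrt(np.pi)
```

With a smooth response, eight quadrature points beat one hundred thousand samples by several orders of magnitude in accuracy, which is why Polynomial Chaos comes out ahead when each evaluation is an expensive simulation.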

