A Global High Dimensional Metamodeling Approach With the Ability of Using Non-Uniform Sampling

Author(s):  
Kambiz Haji Hajikolaei ◽  
G. Gary Wang

In engineering design, spending an excessive amount of time on physical experiments or expensive simulations makes the design process costly and lengthy. This issue is exacerbated when the design problem has a large number of inputs, i.e., is of high dimension. High Dimensional Model Representation (HDMR) is a powerful method for approximating high dimensional, expensive, black-box (HEB) problems. One existing HDMR implementation, Random Sampling HDMR (RS-HDMR), can build an HDMR model from random sample points as a linear combination of basis functions. The most critical issue in RS-HDMR is that calculating the coefficients of the basis functions involves integrals that are approximated by Monte Carlo summations, which are error prone with limited samples, especially under non-uniform sampling. In this paper, a new approach based on Principal Component Analysis (PCA), called PCA-HDMR, is proposed for finding the coefficients that provide the best linear combination of the bases with minimum error and without using any integral. Benchmark problems are modeled using the method and the results are compared with RS-HDMR results. With both uniform and non-uniform sampling, PCA-HDMR built more accurate models than RS-HDMR for a given set of sample points.

2013 ◽  
Vol 136 (1) ◽  
Author(s):  
Kambiz Haji Hajikolaei ◽  
G. Gary Wang

In engineering design, spending an excessive amount of time on physical experiments or expensive simulations makes the design process costly and lengthy. This issue is exacerbated when the design problem has a large number of inputs, i.e., is of high dimension. High dimensional model representation (HDMR) is a powerful method for approximating high dimensional, expensive, black-box (HEB) problems. One existing HDMR implementation, random sampling HDMR (RS-HDMR), can build an HDMR model from random sample points as a linear combination of basis functions. The most critical issue in RS-HDMR is that calculating the coefficients of the basis functions involves integrals that are approximated by Monte Carlo summations, which are error prone with limited samples, especially under nonuniform sampling. In this paper, a new approach based on principal component analysis (PCA), called PCA-HDMR, is proposed for finding the coefficients that provide the best linear combination of the bases with minimum error and without using any integral. Several benchmark problems of different dimensionalities and one engineering problem are modeled using the method and the results are compared with RS-HDMR results. In all problems with both uniform and nonuniform sampling, PCA-HDMR built more accurate models than RS-HDMR for a given set of sample points.
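The contrast between the two coefficient-estimation routes can be sketched numerically. The example below is an illustrative one-dimensional toy, not the paper's setup (the shifted Legendre bases, quadratic target, and Beta-distributed samples are all assumptions): Monte Carlo sums for the coefficients are only consistent under uniform sampling, while an integral-free least-squares fit, in the spirit of PCA-HDMR, stays accurate for non-uniform samples.

```python
import numpy as np

# Toy contrast between the two coefficient estimates. Assumed setup (not
# the paper's): orthonormal shifted-Legendre bases on [0, 1] and a
# quadratic target, sampled non-uniformly from a Beta(2, 5) distribution.
rng = np.random.default_rng(0)

def bases(x):
    # Shifted Legendre polynomials, orthonormal w.r.t. the uniform measure
    return np.stack([np.ones_like(x),
                     np.sqrt(3.0) * (2.0 * x - 1.0),
                     np.sqrt(5.0) * (6.0 * x**2 - 6.0 * x + 1.0)], axis=1)

f = lambda x: 1.0 + 2.0 * x + 3.0 * x**2
x = rng.beta(2.0, 5.0, size=2000)       # deliberately non-uniform samples
Phi, y = bases(x), f(x)

# RS-HDMR-style Monte Carlo sums: consistent only for uniform samples
mc_coeffs = Phi.T @ y / len(x)
# Integral-free least-squares fit, in the spirit of PCA-HDMR
ls_coeffs, *_ = np.linalg.lstsq(Phi, y, rcond=None)

xt = np.linspace(0.0, 1.0, 101)         # uniform test grid
err_mc = np.max(np.abs(bases(xt) @ mc_coeffs - f(xt)))
err_ls = np.max(np.abs(bases(xt) @ ls_coeffs - f(xt)))
print(err_ls < err_mc)                  # → True
```

Because the quadratic target lies in the span of the three bases, the least-squares fit recovers it essentially exactly, while the Monte Carlo sums are biased by the non-uniform sampling density.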


2015 ◽  
Vol 138 (2) ◽  
Author(s):  
Kambiz Haji Hajikolaei ◽  
George H. Cheng ◽  
G. Gary Wang

The recently developed metamodel-based decomposition strategy relies on quantifying the variable correlations of black-box functions so that high-dimensional problems are decomposed into smaller subproblems before performing optimization. Such a two-step method may miss the global optimum due to its rigidity, or may require extra expensive sample points to ensure adequate decomposition. This work develops a strategy to iteratively decompose high-dimensional problems within the optimization process. The sample points evaluated during the optimization are reused to build a metamodel called principal component analysis-high dimensional model representation (PCA-HDMR) for quantifying the intensities of variable correlations by sensitivity analysis. At every iteration, the predicted intensities of the correlations are updated based on all the evaluated points, and a new decomposition scheme is suggested by omitting the weak correlations. Optimization is performed on the iteratively updated subproblems from the decomposition. The proposed strategy is applied to different benchmark and engineering problems, and the results are compared to direct optimization of the undecomposed problems using the trust region based mode pursuing sampling method (TRMPS), a genetic algorithm (GA), the cooperative coevolutionary algorithm with correlation-based adaptive variable partitioning (CCEA-AVP), and dividing rectangles (DIRECT). The results show that, except for the category of undecomposable problems with all or many strong (i.e., important) correlations, the proposed strategy effectively improves the accuracy of the optimization results. The advantages of the new strategy in comparison with the previous methods are also discussed.
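The iterative decomposition loop can be sketched in miniature. This is a heavily simplified illustration, not the paper's sensitivity analysis: a mixed finite difference stands in for the correlation-intensity estimate (an assumption), weak couplings are omitted against a tolerance, and variables are grouped into subproblems via connected components.

```python
import numpy as np

# Miniature sketch of the decomposition idea (the finite-difference
# interaction proxy and tolerance are assumptions, not the paper's
# PCA-HDMR sensitivity analysis): estimate pairwise correlation
# intensities, omit the weak ones, and group variables into subproblems.
def interaction(f, i, j, x0, h=0.5):
    # Mixed difference ~ |d^2 f / dx_i dx_j| as a correlation-intensity proxy
    e = np.eye(len(x0))
    return abs(f(x0 + h * e[i] + h * e[j]) - f(x0 + h * e[i])
               - f(x0 + h * e[j]) + f(x0)) / h**2

def decompose(f, dim, x0, tol=1e-6):
    # Adjacency of "strong" correlations, then connected components
    adj = [[i != j and interaction(f, i, j, x0) > tol
            for j in range(dim)] for i in range(dim)]
    seen, groups = set(), []
    for s in range(dim):
        if s in seen:
            continue
        stack, comp = [s], []
        while stack:
            v = stack.pop()
            if v in seen:
                continue
            seen.add(v)
            comp.append(v)
            stack += [u for u in range(dim) if adj[v][u]]
        groups.append(sorted(comp))
    return groups

# x0 and x1 are coupled; x2 is separable and gets its own subproblem
f = lambda x: x[0] * x[1] + x[2]**2
print(decompose(f, 3, np.zeros(3)))   # → [[0, 1], [2]]
```

Each group would then be optimized as an independent subproblem, with the grouping refreshed as new points are evaluated.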


Author(s):  
Kambiz Haji Hajikolaei ◽  
G. Gary Wang

High Dimensional Model Representation (HDMR) is a tool for generating an approximation of an input-output model for a multivariate function. It can be used to model a black-box function for metamodel-based optimization. Recently, the authors' team developed a radial basis function based HDMR (RBF-HDMR) model that can efficiently model a high dimensional black-box function and, moreover, uncover inner variable structures of the black-box function. This approach, however, requires a completely new, albeit optimized, set of sample points, as dictated by the methodology, while in engineering design practice one often has existing sample data. How to utilize the existing data to efficiently construct an HDMR model is the focus of this paper. We first examine random sampling HDMR (RS-HDMR), which uses orthonormal basis functions as HDMR component functions, so that existing sample points can be used to calculate the coefficients of the basis functions. One important issue with RS-HDMR is that, in theory, the basis functions are derived from continuous integrals arising from the orthonormality conditions. In practice, however, these integrals are approximated by Monte Carlo summation, and thus the basis functions may not satisfy the orthonormality conditions on the actual samples. In this paper, we propose new, adaptive orthonormal basis functions defined with respect to a given set of sample points for RS-HDMR approximation. RS-HDMR models are built for different test functions using the standard and the new adaptive basis functions for different numbers of sample points. The relative errors of both models are calculated and compared. The results show that the models built using the new basis functions are more accurate.
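The sample-adapted orthonormalization can be sketched as Gram-Schmidt under the empirical inner product over the given samples; the monomial starting basis and the Beta-distributed samples below are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

# Gram-Schmidt under the empirical inner product <u, v> = (1/N) Σ u_k v_k
# (monomial start basis and Beta-distributed samples are illustrative
# assumptions): the result is orthonormal on the samples themselves,
# not merely with respect to a continuous uniform integral.
def adaptive_basis(x, degree):
    V = np.vander(x, degree + 1, increasing=True)  # columns 1, x, x^2, ...
    Q = np.empty_like(V, dtype=float)
    n = len(x)
    for j in range(degree + 1):
        q = V[:, j].astype(float)
        for i in range(j):
            q -= (Q[:, i] @ q / n) * Q[:, i]       # strip earlier directions
        Q[:, j] = q / np.sqrt(q @ q / n)           # empirical normalization
    return Q

rng = np.random.default_rng(1)
x = rng.beta(2.0, 5.0, size=500)                   # non-uniform samples
Q = adaptive_basis(x, 3)
G = Q.T @ Q / len(x)                               # empirical Gram matrix
print(np.allclose(G, np.eye(4)))                   # → True
```

The empirical Gram matrix is the identity by construction, so the orthonormality conditions hold exactly on the sample set, which is precisely what the standard (continuous-integral) bases fail to do under non-uniform sampling.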


Author(s):  
Jesper Kristensen ◽  
Isaac Asher ◽  
Liping Wang

Gaussian Process (GP) regression is a well-established probabilistic metamodeling and data analysis tool. The posterior distribution of the GP parameters can be estimated using, e.g., Markov Chain Monte Carlo (MCMC). The ability to make predictions is a key aspect of using such surrogate models. To make a GP prediction, both the MCMC chain and the training data are required. For some applications, GP predictions can require too much computational time and/or memory, especially when there are many training data points. This motivates the present work to represent the GP in an equivalent polynomial (or other global functional) form, called a portable GP. The portable GP inherits many benefits of the GP, including feature ranking via Sobol indices, robust fitting to nonlinear and high-dimensional data, accurate uncertainty estimates, etc. The framework expands the GP in a high-dimensional model representation (HDMR). After fitting each HDMR basis function with a polynomial, they are all added together to form the portable GP. A ranking of which basis functions to use in the fitting process is automatically provided via Sobol indices. The uncertainty from the fitting process can be propagated to the final polynomial estimate. In applications where speed and accuracy are paramount, spline fits to the basis functions give very good results. Finally, the portable GP provides an alternative set of assumptions with regard to extrapolation behavior, which may be more appropriate than the assumptions inherent in GPs.
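The expand-then-fit idea can be sketched as follows. Here a simple analytic function stands in for the fitted GP posterior mean (an assumption, to keep the sketch self-contained and avoid depending on a GP library); its first-order HDMR components on a grid are fit with polynomials and summed into a portable model that no longer needs training data.

```python
import numpy as np

# Expand-then-fit sketch: a simple analytic function stands in for the
# fitted GP posterior mean (an assumption, to stay self-contained); its
# first-order HDMR components on a grid are fit with polynomials and
# summed into a portable model that needs no training data.
g = lambda x1, x2: np.sin(x1) + 0.5 * x2**2   # stand-in "GP" surrogate

grid = np.linspace(-1.0, 1.0, 201)
X1, X2 = np.meshgrid(grid, grid, indexing="ij")
G = g(X1, X2)
g0 = G.mean()                                  # zeroth-order HDMR term
g1 = G.mean(axis=1) - g0                       # first-order term in x1
g2 = G.mean(axis=0) - g0                       # first-order term in x2
p1 = np.polynomial.Polynomial.fit(grid, g1, 5) # portable polynomial pieces
p2 = np.polynomial.Polynomial.fit(grid, g2, 5)
portable = lambda a, b: g0 + p1(a) + p2(b)     # data-free surrogate
err = abs(portable(0.3, -0.4) - g(0.3, -0.4))
print(err < 1e-3)                              # → True
```

The stand-in function is additive, so the first-order expansion is exact and only the polynomial-fit error remains; for general functions, higher-order HDMR terms would be ranked (e.g., by Sobol indices) and included as needed.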


Author(s):  
A. Safari ◽  
K. H. Hajikolaei ◽  
H. G. Lemu ◽  
G. G. Wang

Although metaheuristic techniques have recently become popular in optimization, they are still not suitable for computationally expensive real-world problems, especially when the problems have many input variables. Among these techniques, particle swarm optimization (PSO) is one of the best-known population-based, nature-inspired algorithms; it can intelligently search huge spaces of possible arrangements of design variables to solve various complex problems. The number of candidate solutions, and accordingly the required number of evaluated particles, however, increases dramatically with the number of design variables, i.e., the dimension of the problem. This study presents a major modification of the original PSO that uses all previously evaluated points to increase computational efficiency. For this purpose, a metamodeling methodology appropriate for so-called high-dimensional, expensive, black-box (HEB) problems is used to efficiently generate an approximate function from all particles evaluated during the optimization process. Following the metamodel construction, a term named metamodeling acceleration is added to the velocity update formula of the original PSO algorithm using the minimum of the metamodel. The proposed strategy is called the metamodel guided particle swarm optimization (MGPSO) algorithm. The superior performance of the approach is demonstrated against the original PSO on several benchmark problems with different numbers of variables. The developed algorithm is then used to optimize the aerodynamic design of a gas turbine compressor blade airfoil as a challenging HEB problem. The simulation results illustrate MGPSO's capability to achieve more accurate results with a considerably smaller number of function evaluations.
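The modified velocity update can be sketched as below. The coefficient names (c3), the cheap stand-in objective, and the use of the best evaluated point in place of the metamodel's minimizer are all illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

# Sketch of a PSO velocity update with an extra metamodel-guided pull
# (coefficient c3 and the stand-in objective are assumptions; the best
# evaluated point mimics the metamodel minimum to stay self-contained).
rng = np.random.default_rng(2)

def velocity_update(v, x, p_best, g_best, x_meta,
                    w=0.7, c1=1.5, c2=1.5, c3=1.0):
    r1, r2, r3 = rng.random(3)
    return (w * v
            + c1 * r1 * (p_best - x)   # cognitive pull
            + c2 * r2 * (g_best - x)   # social pull
            + c3 * r3 * (x_meta - x))  # metamodeling acceleration term

f = lambda x: float(x @ x)             # cheap stand-in objective (sphere)
x = rng.uniform(-5.0, 5.0, size=(10, 3))
v = np.zeros_like(x)
p_best = x.copy()
p_val = np.array([f(xi) for xi in x])
init_best = p_val.min()
for _ in range(60):
    g_best = p_best[p_val.argmin()].copy()
    x_meta = g_best                    # stand-in for the metamodel's argmin
    for i in range(len(x)):
        v[i] = velocity_update(v[i], x[i], p_best[i], g_best, x_meta)
        x[i] = x[i] + v[i]
        if f(x[i]) < p_val[i]:
            p_val[i], p_best[i] = f(x[i]), x[i].copy()
print(p_val.min() <= init_best)        # → True (best value never worsens)
```

In the actual MGPSO, x_meta would come from minimizing a metamodel rebuilt from all evaluated particles, so the extra term exploits every past function evaluation rather than only the swarm's memory.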


Energies ◽  
2020 ◽  
Vol 13 (14) ◽  
pp. 3520 ◽  
Author(s):  
Hang Li ◽  
Zhe Zhang ◽  
Xianggen Yin

Because the penetration level of renewable energy sources has increased rapidly in recent years, uncertainty in power system operation is gradually increasing. As an efficient tool for power system analysis under uncertainty, probabilistic power flow (PPF) is becoming increasingly important. The point-estimate method (PEM) is a well-known PPF algorithm. However, two significant defects limit its practical use. One is that the PEM struggles to estimate high-order moments accurately, which makes it difficult for the PEM to describe the distribution of non-Gaussian output random variables (ORVs). The other is that the calculation burden is strongly related to the scale of the input random variables (IRVs), which makes the PEM difficult to use in large-scale power systems. A novel approach based on principal component analysis (PCA) and high-dimensional model representation (HDMR) is proposed here to overcome the defects of the traditional PEM. PCA is applied to reduce the dimension of the IRVs and eliminate their correlations. HDMR is applied to estimate the moments of the ORVs. Because HDMR considers the cooperative effects of the IRVs, it has a significantly smaller estimation error, for high-order moments in particular. Case studies show that the proposed method achieves better accuracy and efficiency than the traditional PEM.
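The PCA stage can be sketched on synthetic correlated inputs (the covariance matrix is an illustrative assumption, not power-flow data): rotating the IRVs into principal components yields uncorrelated variables whose variances are the eigenvalues, so low-variance directions can be dropped to shrink the dimension before the HDMR-based moment estimation.

```python
import numpy as np

# PCA on synthetic correlated inputs (illustrative covariance, not
# power-flow data): rotate IRVs into uncorrelated principal components
# whose variances are the eigenvalues; low-variance components can then
# be dropped to reduce the dimension before moment estimation.
rng = np.random.default_rng(3)
cov = np.array([[1.0, 0.8, 0.1],
                [0.8, 1.0, 0.1],
                [0.1, 0.1, 0.2]])
X = rng.multivariate_normal(np.zeros(3), cov, size=20000)

Xc = X - X.mean(axis=0)                 # center the samples
vals, vecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
order = vals.argsort()[::-1]            # sort by descending variance
vals, vecs = vals[order], vecs[:, order]
Z = Xc @ vecs                           # uncorrelated components
print(np.allclose(np.cov(Z, rowvar=False), np.diag(vals)))  # → True
```

Keeping only the leading columns of Z (largest eigenvalues) gives the reduced, decorrelated input set on which the HDMR moment estimates are built.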


Author(s):  
Anna G. Matveeva ◽  
Victoria N. Syryamina ◽  
Vyacheslav M. Nekrasov ◽  
Michael K. Bowman

Non-uniform schemes for collection of pulse dipole spectroscopy data can decrease and redistribute noise in the distance spectrum for increased sensitivity and throughput.


Mathematics ◽  
2021 ◽  
Vol 9 (11) ◽  
pp. 1262 ◽  
Author(s):  
Sunil Kumar Mishra ◽  
Amitkumar V. Jha ◽  
Vijay Kumar Verma ◽  
Bhargav Appasani ◽  
Almoataz Y. Abdelaziz ◽  
...  

This paper presents an optimized algorithm for event-triggered control (ETC) of networked control systems (NCS). Initially, a traditional backstepping controller is designed for a generalized nonlinear plant in strict-feedback form, which is subsequently extended to the ETC. In the NCS, the controller and the plant communicate with each other over a communication network. To minimize the required bandwidth, the number of samples sent over the communication channel should be reduced. This can be achieved using non-uniform sampling of the data. However, implementing non-uniform sampling without a proper event-triggering rule might drive the closed-loop system toward instability. Therefore, an optimized event-triggering algorithm has been designed such that the system states are always forced to remain on a stable trajectory. Additionally, the effect of the ETC on the stability of the backstepping control has been analyzed using Lyapunov stability theory. Two case studies, on an inverted pendulum system and a single-link robot system, demonstrate the effectiveness of the proposed ETC in terms of system states, control effort, and inter-event execution time.
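A minimal event-triggering rule can be sketched as below; the scalar plant, gain, and fixed threshold are illustrative assumptions rather than the paper's optimized rule or backstepping design. The controller input is refreshed over the "network" only when the state drifts a threshold away from the last transmitted sample, cutting traffic versus per-step transmission.

```python
# Minimal event-triggered loop for a scalar plant x' = u (illustrative
# plant, gain k, and threshold delta; not the paper's optimized rule or
# backstepping controller). Transmissions happen only on trigger events.
dt, k, delta = 0.01, 2.0, 0.05
x, x_sent, events = 1.0, 1.0, 0
for _ in range(1000):
    if abs(x - x_sent) > delta:        # event-triggering condition
        x_sent = x                     # transmit a fresh state sample
        events += 1
    u = -k * x_sent                    # zero-order-hold control input
    x = x + dt * u                     # explicit Euler integration step
print(events)                          # far fewer transmissions than steps
```

With a fixed threshold the state settles into a small neighborhood of the origin; the paper's contribution is choosing the triggering rule so that stability is guaranteed (via Lyapunov analysis) while still keeping the event count low.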

