A Filter-Based Sample Average SQP for Optimization Problems With Highly Nonlinear Probabilistic Constraints

2010 ◽  
Vol 132 (11) ◽  
Author(s):  
Kai-Shian Hsu ◽  
Kuei-Yuan Chan

In this work, we develop a filter-based sequential quadratic programming (SQP) algorithm for solving reliability-based design optimization (RBDO) problems with highly nonlinear constraints. The proposed filter-based SQP uses the approach of average importance sampling (AAIS) in calculating the values and gradients of probabilistic constraints. AAIS allocates samples at the limit state boundaries such that relatively few samples are required in calculating constraint probability values to achieve high accuracy and low variance. The accuracy of probabilistic constraint gradients using AAIS is improved by a sample filter that eliminates sample outliers that have low probability of occurrence and high gradient values. To ensure convergence, the algorithm uses an iteration filter in place of the penalty function to avoid the ill-conditioning problems of the penalty parameters in the acceptance of a design update. A sample reuse mechanism that improves the efficiency of the algorithm by avoiding redundant samples is introduced. The "unsampled" region, the region not covered by previous samples, is identified using iteration step lengths, the trust region, and constraint reliability levels. As a result, the filter-based sampling SQP efficiently handles highly nonlinear probabilistic constraints with multiple most probable points or functions without analytical forms. Several examples are presented, and the results are compared with those from the first-order/second-order reliability methods (FORM/SORM) and Monte Carlo simulations. Results show that by integrating the modified AAIS with the filter-based SQP, the overall computation cost of solving RBDO problems can be significantly reduced.
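The iteration filter mentioned in the abstract can be pictured as a dominance test over (objective, constraint-violation) pairs: a trial design is accepted if no previously stored pair beats it on both measures. A minimal sketch under that interpretation (function names are my own; practical filter SQP methods add envelope margins and sufficient-decrease conditions):

```python
def acceptable(f_new, h_new, filter_pairs):
    # A trial design (f_new, h_new) is acceptable if no stored
    # (objective, violation) pair dominates it: it must improve
    # either f or h against every filter entry.
    return all(f_new < f_i or h_new < h_i for f_i, h_i in filter_pairs)

def add_to_filter(f_new, h_new, filter_pairs):
    # Insert the accepted pair and discard entries it now dominates.
    kept = [(f, h) for f, h in filter_pairs if f < f_new or h < h_new]
    kept.append((f_new, h_new))
    return kept
```

Because acceptance needs no weighted sum of objective and violation, no penalty parameter has to be tuned, which is the ill-conditioning the abstract refers to.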


2018 ◽  
Vol 10 (9) ◽  
pp. 168781401879333 ◽  
Author(s):  
Zhiliang Huang ◽  
Tongguang Yang ◽  
Fangyi Li

Conventional decoupling approaches usually employ the first-order reliability method to deal with probabilistic constraints in a reliability-based design optimization problem. In the first-order reliability method, constraint functions are transformed into a standard normal space. Extra non-linearity introduced by the non-normal-to-normal transformation may increase the error of the reliability analysis and thus reduce the accuracy of the reliability-based design optimization results. In this article, a decoupling approach is proposed to provide an alternative tool for reliability-based design optimization problems. To improve accuracy, the reliability analysis is performed by a first-order asymptotic integration method without any extra non-linear transformation. To achieve high efficiency, an approximate reliability-analysis technique is given that avoids repeated evaluations of the time-consuming performance function. Two numerical examples and an application to a practical laptop structural design are presented to validate the effectiveness of the proposed approach.
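The idea of estimating a failure probability directly in the original space, without a normal-space transformation, can be illustrated with the classical mean-value first-order (MVFOSM) estimate. This is only a stand-in for the paper's first-order asymptotic integration method, shown here for the one-dimensional normal case:

```python
from statistics import NormalDist

def mvfosm_pf(g, mu, sigma, h=1e-6):
    # Linearize g at the mean of X ~ N(mu, sigma) and estimate
    # P(g(X) < 0) as Phi(-mu_g / sigma_g), entirely in the
    # original space -- no X-to-U transformation is performed.
    g_mu = g(mu)
    dg = (g(mu + h) - g(mu - h)) / (2.0 * h)  # central-difference slope
    sigma_g = abs(dg) * sigma
    return NormalDist().cdf(-g_mu / sigma_g)
```

For a linear limit state such as g(x) = 3 - x with x ~ N(0, 1), the estimate is exact; for nonlinear g it degrades, which is precisely the regime the asymptotic methods above target.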


2021 ◽  
Vol 11 (11) ◽  
pp. 5312
Author(s):  
Junho Chun

This paper proposes a reliability-based design optimization (RBDO) approach that adopts the second-order reliability method (SORM) and complex-step (CS) derivative approximation. The failure probabilities are estimated using the SORM, with Breitung's formula and the technique established by Hohenbichler and Rackwitz, and their sensitivities are analytically derived. The CS derivative approximation is used to perform the sensitivity analysis based on these derivations. Given that an imaginary number is used as the step size to compute the first derivative in the CS derivative method, the calculation stability and accuracy are enhanced through elimination of the subtractive cancellation error, which is commonly encountered when using the traditional finite difference method. The proposed approach unifies the CS approximation and SORM to enhance the estimation of the probability and its sensitivity. The sensitivity analysis facilitates the use of gradient-based optimization algorithms in the RBDO framework. The proposed RBDO/CS–SORM method is tested on structural optimization problems with a range of statistical variations. The results demonstrate that the performance can be enhanced while precisely satisfying the probabilistic constraints, thereby increasing the efficiency and efficacy of the optimal design identification. The numerical optimization results obtained using different optimization approaches are compared to validate this enhancement.
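The complex-step idea itself is compact enough to state in a few lines: since f(x + ih) ≈ f(x) + ih f'(x) for real-analytic f, the first derivative is Im f(x + ih)/h, with no subtraction of nearly equal numbers. A minimal sketch (not the paper's CS–SORM implementation):

```python
import cmath

def complex_step_derivative(f, x, h=1e-30):
    # f'(x) ~= Im(f(x + i*h)) / h.  No differencing occurs, so h can
    # be made tiny without incurring subtractive cancellation error.
    return f(complex(x, h)).imag / h

# d/dx sin(x) at x = 1.0 agrees with cos(1.0) to machine precision,
# even with h = 1e-30 -- a step size at which a forward difference
# would return garbage.
deriv = complex_step_derivative(cmath.sin, 1.0)
```

This immunity to step-size choice is what makes the CS approximation attractive for the sensitivity analysis described above.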


2020 ◽  
Vol 142 (10) ◽  
Author(s):  
Hao Wu ◽  
Zhifu Zhu ◽  
Xiaoping Du

When limit-state functions are highly nonlinear, traditional reliability methods, such as the first-order and second-order reliability methods, are not accurate. Monte Carlo simulation (MCS), on the other hand, is accurate if a sufficient sample size is used but is computationally intensive. This research proposes a new system reliability method that combines MCS and the Kriging method with improved accuracy and efficiency. Accurate surrogate models are created for limit-state functions with minimal variance in the estimate of the system reliability, thereby producing high accuracy for the system reliability prediction. Instead of employing global optimization, this method uses MCS samples from which training points for the surrogate models are selected. By considering the autocorrelation of a surrogate model, this method captures the more accurate contribution of each MCS sample to the uncertainty in the estimate of the serial system reliability and therefore chooses training points efficiently. Good accuracy and efficiency are demonstrated by four examples.
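The baseline the method improves on is crude MCS for a series system, where the system fails as soon as any limit state goes negative. A minimal sketch of that baseline (illustrative only; the paper's contribution is replacing the expensive limit-state calls with adaptively trained Kriging surrogates):

```python
import random

def mcs_series_pf(limit_states, sample, n=100_000, seed=0):
    # Crude Monte Carlo: a series system fails when ANY limit-state
    # function is negative at a sampled realization of the inputs.
    rng = random.Random(seed)
    fails = 0
    for _ in range(n):
        x = sample(rng)
        if any(g(x) < 0.0 for g in limit_states):
            fails += 1
    return fails / n

# Two limit states on one standard normal variable: the system fails
# when |x| > 3, so the estimate should be near 2 * Phi(-3) ~= 0.0027.
pf = mcs_series_pf([lambda x: 3.0 - x, lambda x: x + 3.0],
                   lambda rng: rng.gauss(0.0, 1.0))
```

The cost driver is the n limit-state evaluations inside the loop, which is exactly what surrogate-assisted methods like the one above try to cut.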


2008 ◽  
Vol 131 (1) ◽  
Author(s):  
Gordon J. Savage ◽  
Young Kap Son

In this paper, we present a methodology that helps select the distribution parameters in degrading multiresponse systems to improve dependability at the lowest lifetime cost. The dependability measures include both quality (soft failures) and reliability (hard failures). Associated costs of scrap, rework, and warranty work are included. The key to the approach is the fast and efficient creation of the system cumulative distribution function through a series of time-variant limit-state functions. Probabilities are evaluated by Monte Carlo simulation, although the first-order reliability method is a viable alternative. The cost objective function that is common in reliability-based design optimization is expanded to include a lifetime loss-of-performance cost, herein based on present worth theory (also called present value theory). An optimum design in terms of distribution parameters of the design variables is found via a methodology that minimizes cost under performance policy constraints over the lifetime as the system degrades. A case study of an overrun clutch illustrates the insights and potential of the proposed methodology.
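The present-worth discounting that folds lifetime costs into the objective is standard: each future cost C_t is discounted back to time zero at rate r. A small illustration (the cash-flow values are made up, not from the case study):

```python
def present_worth(cashflows, rate):
    # Discount each future cost C_t back to time zero:
    #   PV = sum over t of C_t / (1 + r)^t
    # cashflows maps year t -> cost incurred in that year.
    return sum(c / (1.0 + rate) ** t for t, c in cashflows.items())

# A 110.0 cost one year out, at a 10% discount rate, is worth
# 100.0 today.
pv = present_worth({1: 110.0}, 0.10)
```

Summing such terms over the product's life is what lets a single scalar objective trade off up-front design cost against future scrap, rework, and warranty costs.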


Author(s):  
Zhifu Zhu ◽  
Xiaoping Du

When limit-state functions are highly nonlinear, traditional reliability methods, such as the first-order and second-order reliability methods, are not accurate. Monte Carlo simulation (MCS), on the other hand, is accurate if a sufficient sample size is used, but is computationally intensive. This research proposes a new system reliability method that combines MCS and the Kriging method with improved accuracy and efficiency. Cheap surrogate models are created for limit-state functions with minimal variance in the estimate of the system reliability, thereby producing high accuracy for the system reliability prediction. Instead of employing global optimization, this method uses MCS samples from which training points for the surrogate models are selected. By considering the dependence between responses from a surrogate model, this method captures the true contribution of each MCS sample to the uncertainty in the estimate of the system reliability and therefore chooses training points efficiently. Good accuracy and efficiency are demonstrated by three examples.


2012 ◽  
Vol 134 (12) ◽  
Author(s):  
Zequn Wang ◽  
Pingfeng Wang

A primary concern in practical engineering design is ensuring high system reliability throughout a product's lifecycle, which is subject to time-variant operating conditions and component deteriorations. Thus, the capability of dealing with time-dependent probabilistic constraints in reliability-based design optimization (RBDO) is of vital importance in practical engineering design applications. This paper presents a nested extreme response surface (NERS) approach to efficiently carry out time-dependent reliability analysis and determine the optimal designs. This approach employs the kriging model to build a nested response surface of time corresponding to the extreme value of the limit state function. The efficient global optimization (EGO) technique is integrated with the NERS approach to extract the extreme time responses of the limit state function for any given system design. An adaptive response prediction and model maturation (ARPMM) mechanism is developed based on the mean square error (MSE) to concurrently improve the accuracy and computational efficiency of the proposed approach. With the nested response surface of time, the time-dependent reliability analysis can be converted into the time-independent reliability analysis, and existing advanced reliability analysis and design methods can be used. The NERS approach is compared with existing time-dependent reliability analysis approaches and integrated with RBDO for engineered system design with time-dependent probabilistic constraints. Two case studies are used to demonstrate the efficacy of the proposed NERS approach.
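The conversion at the heart of NERS, from time-dependent to time-independent reliability, rests on the fact that a design fails over a horizon exactly when the extreme (minimum) of the limit state over time is negative. A brute-force sketch of that conversion, with the extreme taken on a fixed time grid rather than by the paper's kriging/EGO machinery (the example limit state is my own):

```python
import random

def time_dependent_pf(g, sample, t_grid, n=50_000, seed=1):
    # Extreme-value conversion: failure over the horizon occurs iff
    # min over t of g(x, t) < 0.  Here the minimum is taken on a
    # coarse grid; NERS instead builds a surrogate for the extreme
    # response to avoid this exhaustive sweep per sample.
    rng = random.Random(seed)
    fails = 0
    for _ in range(n):
        x = sample(rng)
        if min(g(x, t) for t in t_grid) < 0.0:
            fails += 1
    return fails / n

# g(x, t) = x - t/10 on t in [0, 10]: the minimum over t is x - 1, so
# failure means x < 1; with x ~ N(2, 1) the answer is near Phi(-1) ~= 0.159.
pf = time_dependent_pf(lambda x, t: x - t / 10.0,
                       lambda rng: rng.gauss(2.0, 1.0),
                       list(range(11)))
```

Once the extreme response is available, any time-independent reliability or RBDO method can be applied unchanged, which is the point the abstract makes.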


Author(s):  
Kyung K. Choi ◽  
Byeng D. Youn

Deterministic optimum designs that are obtained without consideration of uncertainty can lead to unreliable designs, which calls for a reliability approach to design optimization using a Reliability-Based Design Optimization (RBDO) method. A typical RBDO process iteratively carries out a design optimization in the original random space (X-space) and reliability analysis in an independent standard normal random space (U-space). This process requires numerous nonlinear mappings between X- and U-spaces for various probability distributions. Therefore, the nonlinearity of the RBDO problem depends on the distribution types of the random parameters, since the transformation between X- and U-spaces introduces additional nonlinearity into the reliability-based performance measures evaluated during the RBDO process. Evaluation of probabilistic constraints in RBDO can be carried out in two different ways: the Reliability Index Approach (RIA) and the Performance Measure Approach (PMA). The different reliability analyses employed in RIA and PMA lead to different degrees of nonlinearity in the RBDO process. In this paper, it is shown that RIA becomes much more difficult to solve for non-normally distributed random parameters because of the highly nonlinear transformations involved, whereas PMA is largely insensitive to the probability distributions because it involves the nonlinear transformation only marginally.
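The X-to-U mapping the abstract refers to is the probability transform u = Φ⁻¹(F_X(x)); for any non-normal F_X the resulting map is nonlinear in x, which is the extra nonlinearity RIA must contend with. A minimal one-dimensional illustration (the exponential example is my own):

```python
import math
from statistics import NormalDist

_std = NormalDist()

def x_to_u(x, cdf):
    # Standard probability transform to U-space: u = Phi^{-1}(F_X(x)).
    # For non-normal F_X this mapping is nonlinear in x.
    return _std.inv_cdf(cdf(x))

# Exponential(1) variable: F(x) = 1 - exp(-x).  Its median ln 2 maps
# to u = 0, and doubling x does not double u, so the mapping is
# visibly nonlinear.
exp_cdf = lambda x: 1.0 - math.exp(-x)
```

For a normal X the same formula collapses to the linear map u = (x - μ)/σ, which is why distribution type drives the nonlinearity of the transformed performance measures.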

