A System Uncertainty Propagation Approach With Model Uncertainty Quantification in Multidisciplinary Design


2015 ◽  
Vol 137 (10) ◽  
Author(s):  
Zhen Jiang ◽  
Wei Li ◽  
Daniel W. Apley ◽  
Wei Chen

The performance of a multidisciplinary system is inevitably affected by various sources of uncertainties, usually categorized as aleatory (e.g., input variability) or epistemic (e.g., model uncertainty) uncertainty. In the framework of design under uncertainty, all sources of uncertainties should be aggregated to assess the uncertainty of system quantities of interest (QOIs). In a multidisciplinary design system, uncertainty propagation (UP) refers to the analysis that quantifies the overall uncertainty of system QOIs resulting from all sources of aleatory and epistemic uncertainty originating in the individual disciplines. However, due to the complexity of multidisciplinary simulation, especially the coupling relationships between individual disciplines, many UP approaches in the existing literature only consider aleatory uncertainty and ignore the impact of epistemic uncertainty. In this paper, we address the issue of efficient uncertainty quantification of system QOIs considering both aleatory and epistemic uncertainties. We propose a spatial-random-process (SRP) based multidisciplinary uncertainty analysis (MUA) method that, subsequent to SRP-based disciplinary model uncertainty quantification, fully utilizes the structure of SRP emulators and leads to compact analytical formulas for assessing statistical moments of uncertain QOIs. The proposed method is applied to a benchmark electronic packaging design problem. The estimated low-order statistical moments of the QOIs are compared to the results from Monte Carlo simulations (MCSs) to demonstrate the effectiveness of the method. The UP result is then used to facilitate the robust design optimization of the electronic packaging system.
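The core of the validation step described above is a comparison of closed-form moment estimates against Monte Carlo simulation. A minimal sketch of that comparison, using a toy stand-in emulator y = x² with a Gaussian input (the paper's actual SRP emulators are far more elaborate; all numbers here are invented for illustration):

```python
import numpy as np

# Toy stand-in for a disciplinary emulator: y = x**2.
mu, sigma = 1.0, 0.3          # aleatory input: x ~ N(mu, sigma^2)

# Closed-form first two moments of y = x^2 for Gaussian x, analogous to
# the compact analytical formulas the SRP-based MUA provides:
mean_analytic = mu**2 + sigma**2
var_analytic = 4 * mu**2 * sigma**2 + 2 * sigma**4

# Monte Carlo check, mirroring the MCS comparison in the paper:
rng = np.random.default_rng(0)
x = rng.normal(mu, sigma, 1_000_000)
y = x**2
print(mean_analytic, y.mean())   # both ~ 1.09
print(var_analytic, y.var())     # both ~ 0.3762
```

The analytical route needs no sampling of the emulator, which is what makes it attractive inside a robust design optimization loop.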


2016 ◽  
Vol 138 (8) ◽  
Author(s):  
Zhen Jiang ◽  
Shishi Chen ◽  
Daniel W. Apley ◽  
Wei Chen

Model uncertainty is a significant source of epistemic uncertainty that affects the prediction of a multidisciplinary system. To achieve a reliable design, it is critical to ensure that the disciplinary/subsystem simulation models are trustworthy, so that the aggregated uncertainty of system quantities of interest (QOIs) is acceptable. Model uncertainty can be reduced by gathering additional experimental and simulation data; however, resource allocation for multidisciplinary design optimization (MDO) and analysis remains a challenging task due to the complex structure of the system, which involves decisions about where (sampling locations), what (disciplinary responses), and which type of resource (simulations versus experiments) to allocate. Instead of trying to make these decisions concurrently, which would generally be intractable, we develop a novel approach in this paper that breaks the decision making into a sequential procedure. First, a multidisciplinary uncertainty analysis (MUA) is developed to identify the input settings with unacceptable amounts of uncertainty with respect to the system QOIs. Next, a multidisciplinary statistical sensitivity analysis (MSSA) is developed to investigate the relative contributions of (functional) disciplinary responses to the uncertainty of the system QOIs. The input settings and critical responses at which to allocate resources are selected based on the results from the MUA and MSSA, with the aid of a new correlation analysis derived from spatial-random-process (SRP) modeling concepts, ensuring the sparsity of the selected inputs. Finally, an enhanced preposterior analysis predicts the effectiveness of allocating experimental and/or computational resources, answering the question of which type of resource to allocate. The proposed method is applied to a benchmark electronic packaging problem to demonstrate how epistemic model uncertainty is gradually reduced via resource allocation for data gathering.
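Ranking the contributions of responses to the uncertainty of a QOI is the essence of a statistical sensitivity analysis. A minimal variance-based sketch for a toy linear model (the paper's MSSA handles functional disciplinary responses and SRP correlation structure, which this illustration does not; all numbers are invented):

```python
import numpy as np

# Toy response y = b @ x with independent Gaussian inputs x_i ~ N(0, s_i^2).
b = np.array([1.0, 2.0, 0.5])        # response coefficients
s = np.array([0.2, 0.1, 0.3])        # input standard deviations

var_contrib = (b * s)**2             # per-input variance contributions
S = var_contrib / var_contrib.sum()  # first-order Sobol indices
print(S)                             # ranks which input drives the QOI

# Monte Carlo confirmation of the total variance:
rng = np.random.default_rng(1)
x = rng.normal(0.0, s, size=(500_000, 3))
mc_var = (x @ b).var()
print(var_contrib.sum(), mc_var)
```

Inputs with the largest indices are the natural candidates for additional data gathering.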


Author(s):  
Zhen Jiang ◽  
Shishi Chen ◽  
Daniel W. Apley ◽  
Wei Chen

Epistemic model uncertainty is a significant source of uncertainty that affects a multidisciplinary system. To achieve a reliable design, it is critical to ensure that the disciplinary/subsystem simulation models are trustworthy, so that the aggregated uncertainty of system quantities of interest (QOIs) is acceptable. Uncertainty can be reduced by gathering additional experimental and simulation data; however, resource allocation for multidisciplinary design optimization (MDO) remains a challenging task due to the complex structure of a multidisciplinary system. In this paper, we develop a novel approach that integrates multidisciplinary uncertainty analysis (MUA) and multidisciplinary statistical sensitivity analysis (MSSA) to answer the questions of where (sampling locations), what (disciplinary responses), and which type of resource (simulations versus experiments) to allocate. To manage the complexity of these decisions, a sequential procedure is proposed. First, the input space of the multidisciplinary system is explored to identify the locations with unacceptable amounts of uncertainty with respect to the system QOIs. Next, these input locations are filtered through a correlation check so that they are sparsely located in the input space, and their corresponding critical responses are identified based on the MSSA. Finally, using a preposterior analysis, decisions are made about which type of resource (experimental or computational) should be allocated to the critical responses at the chosen input locations. The proposed method is applied to a benchmark electronic packaging problem to demonstrate how epistemic uncertainty is gradually reduced by gathering more data.


Author(s):  
Farooq Akram ◽  
Matthew A. Prior ◽  
Dimitri N. Mavris

Accurate technology modeling is a challenge, especially for revolutionary concepts. The absence of historical trends and experimental data for these concepts makes it harder to predict their effects precisely, so elicitation from subject matter experts becomes essential. This knowledge generally comes as subjective opinion about the impact of a technology on performance and market-related metrics, and the opinions of multiple subject matter experts may vary depending on their past experience and personal preferences. To account for these differences of opinion, it is important to quantify the uncertainty in these inputs and propagate it to the performance and marketing metrics. In addition to input uncertainties, technology interactions play a vital role when multiple technologies are selected simultaneously. Some processes are already in practice to deal with these interactions: interactions and incompatibilities are currently modeled through technology impact matrices (TIM) and technology compatibility matrices (TCM). These processes, however, require further refinement, because they generally assume the impacts to be additive when a portfolio of technologies is applied; in reality, the technologies are not additive in nature. This problem is addressed through the introduction of technology synergy matrices (TSM). In this paper, an evidence-theory-based TIM and TSM process is demonstrated within the context of an aircraft engine design problem. A representative set of candidate technologies and impacts is provided as an example. Once a combination of technologies is selected, an uncertainty propagation approach is used to evaluate the range of potential effects on the system. Finally, the results are compared with those obtained from a deterministic approach. The TSM, when used in conjunction with the TIM, offers a more accurate quantification of technology interactions and allows for technology nonlinearities.
At the same time, uncertainty quantification enables the designer to capture probabilistic Pareto frontiers and thereby select a robust portfolio of technologies, eliminating unnecessary assumptions made when constructing a deterministic TIM. Comparison of the results from the proposed methodologies with the deterministic approach reveals design space previously unexplored due to the limitations of existing practices.
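The correction that a synergy matrix adds to a purely additive impact model can be sketched in a few lines. All impact and synergy values below are invented for illustration, not taken from any real technology study:

```python
import numpy as np

# Additive technology impacts on one metric (a hypothetical TIM row)
# versus the same portfolio corrected by pairwise synergy terms (a TSM).
impact = np.array([0.05, 0.03, 0.04])      # per-technology metric deltas
synergy = np.array([[0.0, -0.01, 0.0],     # symmetric pairwise interactions
                    [-0.01, 0.0, 0.005],
                    [0.0, 0.005, 0.0]])
sel = np.array([1, 1, 1])                  # selected technology portfolio

additive = impact @ sel                    # classic additive TIM assumption
pairwise = 0.5 * sel @ synergy @ sel       # synergy correction (each pair once)
print(additive, additive + pairwise)       # 0.12 vs 0.115
```

With expert-elicited uncertainty, each entry would become an interval or belief structure rather than a point value, and the portfolio total a range.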


2020 ◽  
Vol 2020 ◽  
pp. 1-18
Author(s):  
Juan Zhang ◽  
Junping Yin ◽  
Ruili Wang

Since 2000, research on uncertainty quantification (UQ) has been successfully applied in many fields and has been highly valued and strongly supported by academia and industry. This review first discusses the sources and types of uncertainties and gives an overall discussion of the goal, practical significance, and basic framework of UQ research. Then, the core ideas and typical methods of several important UQ processes are introduced, including sensitivity analysis, uncertainty propagation, model calibration, Bayesian inference, experimental design, surrogate modeling, and model uncertainty analysis.


Author(s):  
Chen Guoqiang ◽  
Tan Jianping ◽  
Tao Yourui

Uncertainties, including aleatory and epistemic uncertainties, always exist in multidisciplinary systems. Due to the discontinuous nature of epistemic uncertainty and the complex coupling among subsystems, the computational efficiency of reliability-based multidisciplinary design optimization (RBMDO) with mixed aleatory and epistemic uncertainties is extremely low. A novel RBMDO procedure combining probability theory and evidence theory (ET) is presented in this paper to handle the hybrid uncertainties and improve computational efficiency. First, a novel method based on the Bayes approach is proposed to define the probability density functions of the aleatory variables. Second, the conventional equivalent normal method (J-C method) is extended to reliability analysis with hybrid uncertainties. Finally, a novel RBMDO procedure is constructed by integrating the modified J-C method into the framework of sequential optimization and reliability assessment (SORA). Numerical and engineering examples demonstrate the performance of the proposed method, showing its computational efficiency and accuracy. The proposed method provides a practical and effective reliability design approach for multidisciplinary systems.
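The evidence-theory side of such a hybrid analysis reduces a reliability statement to belief and plausibility bounds over interval focal elements. A minimal sketch of that step (not the paper's modified J-C/SORA procedure; the intervals, masses, and limit state are invented):

```python
# Epistemic variable described by interval focal elements with basic
# probability assignments (BPA), and a monotone limit state
# g(x) = 3 - x (the system is safe when g > 0).
focal = [((0.0, 1.0), 0.5),   # (interval, BPA mass) -- illustrative values
         ((1.0, 2.5), 0.3),
         ((2.5, 3.5), 0.2)]
g = lambda x: 3.0 - x

# Belief counts focal elements entirely safe; plausibility counts those
# that could be safe.  For a monotone g the extremes sit at the endpoints.
belief = sum(m for (lo, hi), m in focal if min(g(lo), g(hi)) > 0)
plaus  = sum(m for (lo, hi), m in focal if max(g(lo), g(hi)) > 0)
print(belief, plaus)   # Bel <= true safe probability <= Pl
```

The gap between belief and plausibility is exactly the epistemic contribution that gathering more data can shrink.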


Author(s):  
Katia Bachi ◽  
Karim Abbas ◽  
Bernd Heidergott

In this paper we develop a new Taylor series expansion method for computing model output metrics under epistemic uncertainty in the model input parameters. Specifically, we compute the expected value and the variance of the stationary distribution associated with Markov reliability models. In the multi-parameter case, our approach allows us to analyze the impact of correlation between the uncertainties in the individual parameters on the model output metric. In addition, we bound the true risk using Chebyshev's inequality. Numerical results are presented and compared with the corresponding Monte Carlo simulation results.
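A first-order version of this idea is easy to sketch: propagate a parameter covariance matrix (including correlation terms) through the gradient of the output, then apply Chebyshev's inequality to the resulting variance. The model function and all numbers below are hypothetical stand-ins, not the paper's Markov model:

```python
import numpy as np

def output(theta):            # hypothetical scalar output metric
    return np.exp(-theta[0]) / (theta[0] + theta[1])

theta0 = np.array([0.5, 1.0])           # nominal parameter values
cov = np.array([[0.01, 0.002],          # parameter covariance; off-diagonal
                [0.002, 0.04]])         # terms carry the correlation

# Finite-difference gradient (a Taylor expansion uses exact derivatives):
eps = 1e-6
grad = np.array([(output(theta0 + eps * np.eye(2)[i]) - output(theta0)) / eps
                 for i in range(2)])

mean_approx = output(theta0)            # first-order mean
var_approx = grad @ cov @ grad          # first-order variance
sd = np.sqrt(var_approx)
# Chebyshev: P(|Y - E[Y]| >= k*sd) <= 1/k^2, e.g. k = 3:
print(mean_approx, var_approx, (mean_approx - 3*sd, mean_approx + 3*sd))
```

Because Chebyshev's inequality is distribution-free, the bound holds however the epistemic uncertainty is actually distributed, at the cost of being conservative.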


2016 ◽  
Vol 866 ◽  
pp. 25-30
Author(s):  
He Sheng Tang ◽  
Jia He Mei ◽  
Wei Chen ◽  
Da Wei Li ◽  
Song Tao Xue

Various sources of uncertainty exist in concrete fatigue life prediction, such as variability in loading conditions, material parameters, and experimental data, as well as model uncertainty. In this article, an uncertainty model for concrete fatigue life prediction based on the S-N curve is built, and an evidence theory method is presented for uncertainty analysis in the fatigue life prediction of concrete that accounts for the epistemic uncertainty of the model parameters. Based on experiments on concrete four-point bending beams, the evidence theory method is applied to quantify the epistemic uncertainty stemming from the experimental data and from model uncertainty. To improve computational efficiency, a differential evolution method is adopted to speed up the uncertainty propagation. The efficiency and feasibility of the proposed approach are verified through a comparison with a probability-theory-based analysis.
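The role of differential evolution in such an analysis is to find the extreme model outputs over each focal interval of the epistemic parameters. A sketch under assumed values, using an S-N curve of the form log10(N) = a - b·log10(S) with invented parameter intervals and stress level:

```python
import numpy as np
from scipy.optimize import differential_evolution

S = 4.0                                  # applied stress level (assumed)
bounds = [(12.0, 13.0), (3.0, 3.5)]      # focal intervals for a and b (assumed)

life = lambda p: p[0] - p[1] * np.log10(S)          # log10 of fatigue life

# Min and max of the life over the focal element -> an interval bound
# that feeds the belief/plausibility calculation:
lo = differential_evolution(life, bounds, seed=0).fun
hi = -differential_evolution(lambda p: -life(p), bounds, seed=0).fun
print(lo, hi)    # bounds on log10(N) for this focal element
```

For this monotone toy curve the extremes sit at interval corners, but differential evolution also handles the non-monotone responses for which corner checks fail.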


2021 ◽  
Vol 247 ◽  
pp. 10002
Author(s):  
V. Vallet ◽  
J. Huyghe ◽  
C. Vaglio-Gaudard ◽  
D. Lecarpentier ◽  
C. Reynard-Carette

Currently there are no integral experimental data for code validation regarding the decay heat of MOX fuels, except for fission burst experiments (covering the fission product contribution at short cooling times) and post-irradiation experiments on nuclide inventories (restricted to a limited number of nuclides of interest for decay heat). Uncertainty quantification therefore relies mainly on the propagation of nuclear data covariances. In recent years, the transposition method, based on data assimilation theory, has been used to transpose the experiment-to-calculation discrepancies at a given set of parameters (cooling time, fuel burnup) to another set of parameters. As an example, this method was applied to the CLAB experiments, and the experiment-to-calculation discrepancies at 13 years were transposed to a UOX fuel between 5 and 27 years of cooling and burnups from 10 to 50 GWd/t. The purpose of this paper is to study to what extent the transposition method can be used for MOX fuels. In particular, the Dickens fission burst experiment on 239Pu was considered for MOX fuels at short cooling times (< 1h30) and low burnup (< 10 GWd/t). The impact of fission yield (FY) correlations is also discussed. In conclusion, the efficiency of the transposition process is limited by experimental uncertainties that are larger than the nuclear data uncertainties, and by the fact that fission burst experiments are representative only of the FY contribution to the decay heat uncertainty of an irradiated reactor fuel. Nevertheless, the method consolidates the decay heat uncertainties at very short cooling times, which were previously based only on nuclear data covariance propagation.
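In a generalized-least-squares view of such a transposition, the experimental discrepancy reduces the nuclear-data contribution to the target uncertainty through the shared sensitivity profiles, and a large experimental variance limits the reduction. A schematic sketch with entirely invented covariances and sensitivities (not evaluated nuclear data):

```python
import numpy as np

M   = np.diag([0.04, 0.09, 0.01])       # prior nuclear-data covariance (assumed)
s_e = np.array([1.0, 0.5, 0.2])         # experiment sensitivities (assumed)
s_t = np.array([0.8, 0.6, 0.3])         # target (MOX) sensitivities (assumed)
v_e = 0.05                              # experimental variance (often large)

prior_var = s_t @ M @ s_t               # target variance before assimilation
gain = (s_t @ M @ s_e)**2 / (s_e @ M @ s_e + v_e)
post_var = prior_var - gain             # transposed (posterior) variance
print(prior_var, post_var)  # large v_e limits the uncertainty reduction
```

Setting v_e much larger than the nuclear-data terms makes the gain vanish, which mirrors the paper's conclusion that experimental uncertainties limit the transposition's efficiency.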

