An efficient metamodeling approach for uncertainty quantification of complex systems with arbitrary parameter probability distributions

2016 ◽  
Vol 109 (5) ◽  
pp. 739-760 ◽  
Author(s):  
Hua-Ping Wan ◽  
Wei-Xin Ren ◽  
Michael D. Todd
Algorithms ◽  
2020 ◽  
Vol 13 (3) ◽  
pp. 51
Author(s):  
Dimitrios Loukrezis ◽  
Herbert De Gersem

Approximation and uncertainty quantification methods based on Lagrange interpolation are typically abandoned in cases where the probability distributions of one or more system parameters are not normal, uniform, or closely related distributions, due to the computational issues that arise when one wishes to define interpolation nodes for general distributions. This paper examines the use of the recently introduced weighted Leja nodes for that purpose. Weighted Leja interpolation rules are presented, along with a dimension-adaptive sparse interpolation algorithm, to be employed in the case of high-dimensional input uncertainty. The performance and reliability of the suggested approach are verified by four numerical experiments, where the respective models feature extreme-value and truncated-normal parameter distributions. Furthermore, the suggested approach is compared with a well-established polynomial chaos method and found to be either comparable or superior in terms of approximation and statistics-estimation accuracy.
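
A minimal sketch of the node-selection idea, assuming a greedy search over a fine candidate grid (the truncated-normal example and function names are mine, not the paper's; the square-root-of-density weight is one common choice):

```python
# Greedy weighted Leja sequence for an arbitrary input density:
# each new node maximizes sqrt(pdf(x)) * prod_j |x - x_j| over the grid.
import numpy as np
from scipy import stats

def weighted_leja_nodes(pdf, grid, n_nodes):
    w = np.sqrt(pdf(grid))
    nodes = [grid[np.argmax(w)]]  # start at the weighted maximum
    for _ in range(n_nodes - 1):
        dist_prod = np.prod(np.abs(grid[:, None] - np.array(nodes)), axis=1)
        nodes.append(grid[np.argmax(w * dist_prod)])
    return np.array(nodes)

# Example: interpolation nodes adapted to a standard normal truncated to [0, 4].
trunc = stats.truncnorm(a=0.0, b=4.0)
grid = np.linspace(0.0, 4.0, 20001)
print(weighted_leja_nodes(trunc.pdf, grid, 8))
```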


2009 ◽  
Author(s):  
Matthew D. Grace ◽  
James T. Ringland ◽  
Youssef M. Marzouk ◽  
Paul T. Boggs ◽  
Rena M. Zurn ◽  
...  

Author(s):  
Djamalddine Boumezerane

Abstract In this study, we use possibility distributions as a basis for parameter uncertainty quantification in one-dimensional consolidation problems. A possibility distribution is the one-point coverage function of a random set and is viewed as capturing both partial ignorance and uncertainty. The vagueness and scarcity of the information needed to characterize the coefficient of consolidation in clay can be handled using possibility distributions, which can be constructed from existing data or by transformation of probability distributions. We set out a systematic approach for estimating uncertainty propagation during the consolidation process; the measure of uncertainty is based on Klir's (1995) definition. We compare our results with those obtained from other approaches (probabilistic…) and discuss the importance of using possibility distributions in this type of problem.
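
As an illustration of alpha-cut propagation (my sketch, not the paper's code; the triangular distribution, time, and drainage path below are hypothetical): Terzaghi's one-dimensional consolidation solution is monotone in the coefficient of consolidation, so each alpha-cut interval of c_v maps directly to an interval of the degree of consolidation.

```python
import numpy as np

def degree_of_consolidation(cv, t, H, n_terms=50):
    """Average degree of consolidation from Terzaghi's series solution,
    with time factor Tv = cv * t / H**2 (drainage path H)."""
    Tv = cv * t / H**2
    m = np.arange(n_terms)
    M = np.pi * (2 * m + 1) / 2
    return 1.0 - np.sum(2.0 / M**2 * np.exp(-(M**2) * Tv))

# Hypothetical triangular possibility distribution for c_v (m^2/year):
lo, mode, hi = 1.0, 2.0, 5.0
t, H = 1.0, 2.0  # elapsed time (years) and drainage path (m), assumed values

for alpha in (0.0, 0.5, 1.0):
    # The alpha-cut of a triangular distribution is an interval; U is
    # increasing in c_v, so the cut maps to [U(cv_min), U(cv_max)].
    cv_min = lo + alpha * (mode - lo)
    cv_max = hi - alpha * (hi - mode)
    print(alpha, degree_of_consolidation(cv_min, t, H),
          degree_of_consolidation(cv_max, t, H))
```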


Author(s):  
D. Bigoni ◽  
A. P. Engsig-Karup ◽  
H. True

This paper describes the results of applying Uncertainty Quantification methods to a simple railroad vehicle dynamics example. Uncertainty Quantification methods take into account the probability distributions of the system parameters that stem from the parameter tolerances. Here the methods are applied to a low-dimensional vehicle dynamical model composed of a two-axle truck connected to a car body by a lateral spring, a lateral damper, and a torsional spring, all with linear characteristics. These characteristics are not deterministically defined but are instead given by probability distributions. The same model, with deterministically defined parameters, was studied in [1] and [2]; this article focuses on the calculation of the model's critical speed when the distributions of the parameters are taken into account. Results of traditional Monte Carlo sampling are compared with those of advanced Uncertainty Quantification methods [3], and the computational performance and fast convergence of the latter are highlighted. Generalized Polynomial Chaos is presented in the Collocation form, with emphasis on the pros and cons of each approach.
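
A minimal sketch of the contrast between the two approaches (the smooth response and tolerance band below are hypothetical stand-ins, not the paper's vehicle model):

```python
# Monte Carlo sampling vs. stochastic collocation on Gauss nodes for the
# mean of a scalar output, e.g. a critical speed v_c(k) that depends on a
# uniformly distributed spring stiffness k.
import numpy as np

def critical_speed(k):
    # Hypothetical smooth response standing in for the bifurcation
    # analysis of the two-axle truck model.
    return 50.0 + 8.0 * np.sqrt(k) - 0.3 * k

rng = np.random.default_rng(0)
k_lo, k_hi = 1.0, 4.0  # assumed uniform tolerance band on the stiffness

# Monte Carlo: O(N^-1/2) convergence, here with 10^5 samples.
mc = critical_speed(rng.uniform(k_lo, k_hi, 100_000)).mean()

# Collocation: 5 Gauss-Legendre nodes mapped to [k_lo, k_hi]; for smooth
# responses the quadrature converges exponentially in the node count.
x, w = np.polynomial.legendre.leggauss(5)
k_nodes = 0.5 * (k_hi - k_lo) * x + 0.5 * (k_hi + k_lo)
sc = 0.5 * np.sum(w * critical_speed(k_nodes))  # Gauss weights sum to 2

print(mc, sc)
```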


2020 ◽  
Vol 8 ◽  
Author(s):  
Brioch Hemmings ◽  
Matthew J. Knowling ◽  
Catherine R. Moore

Effective decision making for resource management is often supported by combining predictive models with uncertainty analyses. This combination allows quantitative assessment of management strategy effectiveness and risk. Typically, history matching is undertaken to increase the reliability of model forecasts. However, the question of whether the potential benefit of history matching will be realized, or outweigh its cost, is seldom asked. History matching adds complexity to the modeling effort, as information from historical system observations must be appropriately blended with the prior characterization of the system. Consequently, the cost of history matching is often significant. When it is not implemented appropriately, history matching can corrupt model forecasts. Additionally, the available data may offer little decision-relevant information, particularly where data and forecasts are of different types, or represent very different stress regimes. In this paper, we present a decision support modeling workflow where early quantification of model uncertainty guides ongoing model design and deployment decisions. This includes providing justification for undertaking (or forgoing) history matching, so that unnecessary modeling costs can be avoided and model performance can be improved. The workflow is demonstrated using a regional-scale modeling case study in the Wairarapa Valley (New Zealand), where assessments of stream depletion and nitrate-nitrogen contamination risks are used to support water-use and land-use management decisions. The probability of management success/failure is assessed by comparing the proximity of model forecast probability distributions to ecologically motivated decision thresholds. This study highlights several important insights that can be gained by undertaking early uncertainty quantification, including: i) validation of the prior numerical characterization of the system, in terms of its consistency with historical observations; ii) validation of model design or indication of areas of model shortcomings; iii) evaluation of the relative proximity of management decision thresholds to forecast probability distributions, providing a justifiable basis for stopping modeling.
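
The early, prior-based assessment can be sketched as follows (the ensemble, distribution, and threshold are hypothetical illustrations, not the study's model):

```python
# Assess the probability of management failure by comparing a prior
# forecast ensemble with an ecologically motivated decision threshold,
# before committing to history matching.
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical prior ensemble of forecast stream depletion (L/s), obtained
# by running the model under prior parameter samples.
prior_forecast = rng.lognormal(mean=1.0, sigma=0.5, size=2000)
threshold = 4.0  # assumed decision threshold

p_fail = np.mean(prior_forecast > threshold)
print(f"P(depletion > threshold) = {p_fail:.2f}")
# If p_fail is already far from any decision boundary (near 0 or 1),
# history matching may add cost without changing the management outcome.
```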


Author(s):  
Idir Arab ◽  
Milto Hadjikyriakou ◽  
Paulo Eduardo Oliveira ◽  
Beatriz Santos

Abstract The star-shaped ordering between probability distributions is a common way to express aging properties. A well-known criterion to order families of distributions that depend on one real parameter was proposed by Saunders and Moran [(1978). On the quantiles of the gamma and F distributions. Journal of Applied Probability 15(2): 426–432]. However, the lifetime of complex systems usually depends on several parameters, especially when considering heterogeneous components. We extend the Saunders–Moran criterion, characterizing the star-shaped order when the multidimensional parameter moves along a given direction. A few applications to the lifetimes of complex models, namely parallel and series models with differing individual component behaviors, are discussed.
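
For reference, a minimal statement of the star-shaped order and the one-parameter criterion being generalized (notation mine; sign conventions for the direction of the ordering vary in the literature):

```latex
% Star-shaped order for nonnegative random variables X, Y with cdfs F, G:
\[
  X \le_{*} Y
  \quad\Longleftrightarrow\quad
  \frac{G^{-1}\!\left(F(x)\right)}{x} \ \text{is nondecreasing in } x > 0 .
\]
% Saunders--Moran-type criterion: a family \{F_\theta\} is ordered in the
% star order along \theta when the following ratio is monotone in x:
\[
  \frac{\partial F_\theta(x) / \partial \theta}{x\, f_\theta(x)} .
\]
```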


2011 ◽  
Vol 10 (1) ◽  
pp. 140-160 ◽  
Author(s):  
Akil Narayan ◽  
Dongbin Xiu

Abstract In this work we consider a general notion of distributional sensitivity, which measures the variation in solutions of a given physical/mathematical system with respect to the variation of the probability distribution of the inputs. This is distinctly different from classical sensitivity analysis, which studies the changes of solutions with respect to the values of the inputs. The general idea is to measure the sensitivity of outputs with respect to probability distributions, a concept well studied in related disciplines. We adapt these ideas to present a quantitative framework in the context of uncertainty quantification for measuring this kind of sensitivity, together with a set of efficient algorithms to approximate the distributional sensitivity numerically. A remarkable feature of the algorithms is that they incur no computational effort beyond a one-time stochastic solve: an accurate stochastic computation with respect to a prior input distribution is needed only once, and the ensuing distributional sensitivity computations for different input distributions are a post-processing step. We prove that an accurate numerical model leads to accurate calculations of this sensitivity, a result that applies not only to slowly converging Monte Carlo estimates but also to exponentially convergent spectral approximations. We provide computational examples to demonstrate the ease of applicability and to verify the convergence claims.
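
One way to read the post-processing claim, as a minimal sketch (the forward model and distributions are hypothetical; likelihood-ratio reweighting is one way to realize the idea, not necessarily the authors' algorithm):

```python
# Solve the stochastic problem once under a prior input density, then
# re-estimate output statistics under a perturbed density by importance
# reweighting, with no additional solver runs.
import numpy as np
from scipy import stats

def model(x):
    # Hypothetical forward model standing in for the expensive solver.
    return np.sin(x) + 0.1 * x**2

rng = np.random.default_rng(2)
prior = stats.norm(loc=0.0, scale=1.0)
x = prior.rvs(size=50_000, random_state=rng)
y = model(x)  # the one-time stochastic "solve"

mean_prior = y.mean()

# Post-processing step: statistics under a shifted input distribution.
perturbed = stats.norm(loc=0.3, scale=1.0)
w = perturbed.pdf(x) / prior.pdf(x)  # likelihood-ratio weights
mean_perturbed = np.average(y, weights=w)

# Distributional sensitivity of the mean to this perturbation:
print(mean_perturbed - mean_prior)
```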

