Unified Reliability Measure Method Considering Uncertainties of Input Variables and Their Distribution Parameters

2021 ◽  
Vol 11 (5) ◽  
pp. 2265
Author(s):  
Yufeng Lyu ◽  
Zhenyu Liu ◽  
Xiang Peng ◽  
Jianrong Tan ◽  
Chan Qiu

Aleatoric and epistemic uncertainties can be represented probabilistically in mechanical systems. However, the distribution parameters of epistemic uncertainties are themselves uncertain because the available uncertainty information is sparse or inaccurate. Therefore, a unified reliability measure method that simultaneously considers the uncertainties of input variables and of their distribution parameters is proposed. The uncertainty information for the distribution parameters of epistemic uncertainties may take the form of insufficient data or interval information, which is represented with evidence theory. The probability density function of the uncertain distribution parameters is constructed by fusing insufficient data and interval information with a Gaussian interpolation algorithm, and the epistemic uncertainties are represented as a weighted sum of probability variables based on discrete distribution parameters. The reliability index considering aleatoric and epistemic uncertainties is calculated around the most probable point. The effectiveness of the proposed algorithm is demonstrated through comparison with the Monte Carlo method in the engineering examples of a crank-slider mechanism and a composite laminated plate.
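The weighted-sum representation of epistemic uncertainty can be illustrated with a minimal Monte Carlo sketch. The limit state, the candidate parameter values, and the evidence weights below are all hypothetical, not the paper's data; the point is only that a failure probability under uncertain distribution parameters becomes a weighted sum of conditional failure probabilities.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical limit state g(x) = 3 - x: failure when g < 0.
# The epistemic input is X ~ Normal(mu, 1), but mu itself is uncertain;
# evidence theory supplies discrete candidate values with BPA weights.
mu_values = np.array([0.9, 1.0, 1.1])
weights = np.array([0.2, 0.5, 0.3])   # basic probability assignments, sum to 1

n = 200_000
pf_conditional = [np.mean(3.0 - rng.normal(mu, 1.0, n) < 0.0) for mu in mu_values]

# Unified failure probability: weighted sum over the discrete parameter values.
pf = float(np.dot(weights, pf_conditional))
print(pf)
```

Each conditional failure probability could equally be computed with a most-probable-point approximation instead of sampling; the weighting step is unchanged.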

2016 ◽  
Vol 837 ◽  
pp. 64-67
Author(s):  
Katarina Tvrda

The probabilistic design analysis examines a plate with uncertain input parameters. These input parameters (geometry, material properties, boundary conditions, etc.) are defined in the software model. The variations of the input parameters are defined as random input variables and are characterized by their distribution type (Gaussian, lognormal, etc.) and by their distribution parameters (mean value, standard deviation, etc.). During a probabilistic analysis, the software executes multiple analysis loops to compute the random output parameters as a function of the set of random input variables. The values of the input variables are generated either randomly (using Monte Carlo simulation) or as prescribed samples (using response surface methods). In the conclusion, some results of these probabilistic methods are presented.
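The loop structure described above can be sketched in a few lines. The distributions and plate values below are illustrative placeholders; the response uses the textbook center-deflection formula for a uniformly loaded, simply supported square plate, w_max = 0.00406 q a^4 / D, standing in for the software's analysis step.

```python
import numpy as np

rng = np.random.default_rng(1)
n_loops = 50_000

# Random input variables (hypothetical plate; values are illustrative only):
E = rng.normal(210e9, 10e9, n_loops)           # Young's modulus [Pa], Gaussian
t = rng.normal(0.02, 0.001, n_loops)           # thickness [m], Gaussian
q = rng.lognormal(np.log(5e3), 0.1, n_loops)   # load [Pa], lognormal
a = 2.0                                        # side length [m], deterministic

# Analysis loop: center deflection of a simply supported square plate,
# with flexural rigidity D = E t^3 / (12 (1 - nu^2)).
nu = 0.3
D = E * t**3 / (12.0 * (1.0 - nu**2))
w = 0.00406 * q * a**4 / D

# Random output parameter: summary statistics over all analysis loops.
print(w.mean(), w.std(), np.quantile(w, 0.95))
```

A response-surface variant would replace the random draws with a prescribed sample design and fit a surrogate to the resulting outputs.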


2014 ◽  
Vol 2014 ◽  
pp. 1-11 ◽  
Author(s):  
Lei Cheng ◽  
Zhenzhou Lu ◽  
Luyi Li

An extension of Borgonovo's global sensitivity analysis is proposed to measure the influence of fuzzy distribution parameters on the fuzzy failure probability by averaging the shift between the membership functions (MFs) of the unconditional and conditional failure probability. The presented global sensitivity indices can reasonably reflect the influence of fuzzy-valued distribution parameters on the character of the failure probability, whereas solving the MFs of the unconditional and conditional failure probability is time-consuming because of the multiple-loop sampling and optimization operators involved. To overcome the large computational cost, a single-loop simulation (SLS) is introduced to estimate the global sensitivity indices. By establishing a sampling probability density, only one set of samples of the input variables is needed to evaluate the MFs of the unconditional and conditional failure probability in the presented SLS method. The significance of the global sensitivity indices is verified and demonstrated through several numerical and engineering examples.
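The single-loop idea, stripped of the fuzzy machinery, is importance reweighting: draw one sample set from a fixed sampling density, then estimate the failure probability under different distribution-parameter values by reweighting those same samples instead of re-simulating. The limit state and parameter values below are hypothetical, chosen only to make the sketch concrete.

```python
import numpy as np

rng = np.random.default_rng(2)

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# A single sample set drawn once from a fixed sampling density N(0, 1.5),
# chosen wide enough to cover the parameter range of interest.
n = 400_000
x = rng.normal(0.0, 1.5, n)
fail = (2.5 - x) < 0.0           # hypothetical limit state g(x) = 2.5 - x
h = normal_pdf(x, 0.0, 1.5)      # sampling density evaluated at the samples

# Conditional failure probability for several candidate parameter values mu,
# all obtained from the same samples via importance weights f_mu(x) / h(x).
pf = {}
for mu in (0.0, 0.2, 0.4):
    w = normal_pdf(x, mu, 1.0) / h
    pf[mu] = float(np.mean(fail * w))
print(pf)
```

Sweeping mu over the support of a fuzzy parameter in this way traces out the conditional failure probabilities needed for the membership functions without any inner sampling loop.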


2021 ◽  
Vol 0 (0) ◽  
Author(s):  
Zhiqiang Wang ◽  
Zhenyu Lei

Abstract In order to study the waterproof performance of the elastic rubber gasket in shield tunnel lining joints, an innovative sensitivity analysis method is proposed in this paper by combining the Monte Carlo method with the stochastic finite element method (FEM). The sensitivity values of the waterproof performance with respect to the elastic rubber gasket are obtained via the ANSYS Probabilistic Design System (PDS) module, in which the material hardness, the coordinates of the hole centers, and the apertures are selected as random input variables. Meanwhile, the extent to which the tolerances of the random parameters affect the waterproof performance is explored.
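Monte Carlo sensitivity studies of this kind typically report rank-correlation coefficients between each random input and the response. The sketch below is a generic illustration of that idea; the variable names, distributions, and surrogate response are invented for the example and are not the paper's model or ANSYS output.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10_000

# Hypothetical random input variables for a gasket-like model; the names,
# distributions, and response below are illustrative, not the paper's data.
hardness = rng.normal(60.0, 3.0, n)    # material hardness
offset   = rng.normal(0.0, 0.1, n)     # hole-center coordinate deviation
aperture = rng.normal(4.0, 0.2, n)     # aperture

# Made-up surrogate standing in for a contact-pressure-type FEM response.
pressure = 0.8 * hardness - 2.0 * offset**2 - 1.5 * aperture + rng.normal(0.0, 1.0, n)

def rank_corr(u, v):
    """Spearman rank correlation: Pearson correlation of the rank vectors."""
    ru = np.argsort(np.argsort(u))
    rv = np.argsort(np.argsort(v))
    return float(np.corrcoef(ru, rv)[0, 1])

sens = {name: rank_corr(var, pressure)
        for name, var in [("hardness", hardness), ("offset", offset), ("aperture", aperture)]}
print(sens)
```

Note that the quadratic (non-monotonic) offset term yields a near-zero rank correlation even though it does affect the response, a known limitation of correlation-based sensitivity measures.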


2013 ◽  
Vol 329 ◽  
pp. 344-348
Author(s):  
Shao Pu Zhang ◽  
Tao Feng

Evidence theory is an effective method for dealing with uncertain information, and an uncertainty measure reflects the uncertainty of an information system. We therefore merge evidence theory with uncertainty measures in order to quantify the roughness of a rough approximation space. This paper discusses information fusion and uncertainty measures based on rough set theory. First, we propose a new method of information fusion based on the Bayes function, and define a pair of belief and plausibility functions using the fused mass function in an information system. Then we construct an entropy for every decision class to measure its roughness, and an entropy for the decision information system to measure the consistency of the decision table.
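The belief/plausibility pair derived from a mass function can be computed directly from the standard definitions: Bel(A) sums the mass of focal elements contained in A, Pl(A) the mass of focal elements intersecting A. The frame and masses below are illustrative.

```python
# Mass (basic probability assignment) function over a frame {"a", "b", "c"};
# the focal elements and masses are illustrative.
m = {
    frozenset({"a"}): 0.4,
    frozenset({"a", "b"}): 0.3,
    frozenset({"a", "b", "c"}): 0.3,
}

def bel(A):
    """Belief of A: total mass of focal elements contained in A."""
    return sum(v for B, v in m.items() if B <= A)

def pl(A):
    """Plausibility of A: total mass of focal elements intersecting A."""
    return sum(v for B, v in m.items() if B & A)

A = frozenset({"a", "b"})
print(bel(A), pl(A))   # Bel(A) <= Pl(A) always holds
```

The interval [Bel(A), Pl(A)] brackets the unknown probability of A, which is what makes the pair useful as an uncertainty measure.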


2013 ◽  
Vol 20 (2) ◽  
pp. 249-262 ◽  
Author(s):  
Sergiusz Sienkowski

Abstract The paper is concerned with the estimation of random variable distribution parameters by the Monte Carlo method. Such quantities can correspond to statistical parameters computed from data obtained in typical measurement situations. The subjects of the research are the mean, the mean square, and the variance of random variables with uniform, Gaussian, Student's t, Simpson, trapezoidal, exponential, gamma, and arcsine distributions.
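The basic Monte Carlo estimation setup can be sketched for a few of the listed distributions. The sample size and distribution parameters below are arbitrary choices for illustration; each estimator is compared against the known population value.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000

# Monte Carlo samples for a few of the distributions considered.
samples = {
    "uniform":     rng.uniform(0.0, 1.0, n),   # mean 1/2, variance 1/12
    "gaussian":    rng.normal(2.0, 3.0, n),    # mean 2, variance 9
    "exponential": rng.exponential(2.0, n),    # mean 2, variance 4
}

# Estimates of the mean, the mean square, and the (unbiased) variance.
for name, x in samples.items():
    mean = x.mean()
    mean_sq = np.mean(x**2)
    var = x.var(ddof=1)
    print(name, mean, mean_sq, var)
```

Repeating such runs many times gives the empirical bias and dispersion of the estimators, which is the kind of study the paper performs.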


Author(s):  
Yao Cheng ◽  
Xiaoping Du

Distributions of the input variables of a limit-state function are required for reliability analysis. The distribution parameters are commonly estimated from samples. If some of the samples are given as intervals, the estimated distribution parameters will also be intervals. Traditional reliability methodologies assume that interval distribution parameters are independent, but as shown in this study, the parameters are actually dependent because they are estimated from the same set of samples. This study investigates the effect of the dependence of the distribution parameters on the accuracy of reliability analysis results. The major approach is numerical simulation and optimization. This study indicates that, for interval samples, the independent-distribution-parameter assumption makes the estimated reliability bounds wider than the true bounds. The reason is that the actual combinations of the distribution parameters may not fill the entire box-type domain implied by the independent-interval-parameter assumption. The results of this study not only reveal the cause of the inaccuracy of the independence assumption, but also demonstrate the need to develop new reliability methods that accommodate dependent distribution parameters.
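The dependence effect can be demonstrated numerically. In the sketch below (all sample values and the limit are invented), every joint realization of the interval samples produces one (mean, std) pair, so the feasible parameter set is a curve-like subset of the independent box, and the reliability-index bounds from the joint set are narrower than the box bounds.

```python
import numpy as np
from itertools import product

# Point samples plus two interval samples [lo, hi]; all values illustrative.
points = np.array([4.8, 5.1, 5.3, 4.9, 5.0])
intervals = [(4.5, 5.5), (4.7, 5.6)]

# Discretize each interval and enumerate joint realizations: every realization
# yields one (mean, std) pair, so the two parameters are dependent.
grids = [np.linspace(lo, hi, 11) for lo, hi in intervals]
pairs = np.array([
    (np.mean(np.concatenate([points, combo])),
     np.std(np.concatenate([points, combo]), ddof=1))
    for combo in product(*grids)
])
mu_lo, mu_hi = pairs[:, 0].min(), pairs[:, 0].max()
sd_lo, sd_hi = pairs[:, 1].min(), pairs[:, 1].max()

# Reliability index beta = (limit - mu) / sigma for a hypothetical limit 7.0.
limit = 7.0
beta_joint = (limit - pairs[:, 0]) / pairs[:, 1]

# Independent-box assumption: mu and sigma vary freely within their ranges,
# so the extreme corners of the box set the (wider) bounds.
beta_box_lo = (limit - mu_hi) / sd_hi
beta_box_hi = (limit - mu_lo) / sd_lo
print(beta_joint.min(), beta_joint.max())
print(beta_box_lo, beta_box_hi)
```

The corner combinations of the box (e.g. smallest mean with smallest standard deviation) are not achievable from any realization of the interval samples, which is exactly why the independent assumption over-widens the bounds.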


SPE Journal ◽  
2018 ◽  
Vol 23 (05) ◽  
pp. 1566-1579 ◽  
Author(s):  
Y. Z. Ma

Summary One of the most-important bases for field-development planning is the estimate of hydrocarbon initially in place (HIIP), which has been traditionally estimated either deterministically or by Monte Carlo simulation. The classical volumetric calculation is the most-common deterministic method, and it requires the use of the averages of the reservoir variables and thus does not model the correlations of the input variables. It is well-known that ignoring the correlations among the reservoir variables can lead to incorrect estimations of hydrocarbon volumetrics. The Monte Carlo method uses the input means in the volumetric equation for random simulation of hydrocarbon volumes, yet allows modeling of the correlation between the input parameters. However, using the means and modeling the correlation of the properties are theoretically conflicting. This paper presents new parametric equations for volumetric calculation using mathematical correlation. Unlike the classical volumetric calculation, these equations are the exact expressions of the rigorous hydrocarbon volumetric equation. We discuss how these new equations enable the quantification of inaccuracy in hydrocarbon volumetric estimation by the classical method. Our examples will further show that the magnitude of inaccuracy of the traditional volumetrics depends on the reservoir characteristics; the inaccuracy is generally more significant for heterogeneous, low-quality, and tight reservoirs than for relatively homogeneous high-quality reservoirs.
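The core of the argument is that the expectation of a product is not the product of expectations when the factors correlate; for two variables, E[XY] = E[X]E[Y] + Cov(X, Y). A minimal sketch with two correlated reservoir properties (invented lognormal marginals and correlation, not field data) shows the classical "product of means" understating the pore volume when the correlation is positive.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 500_000

# Correlated porosity and net pay: illustrative lognormal marginals built
# from a bivariate normal with correlation rho (all values are made up).
rho = 0.6
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], n)
phi = 0.15 * np.exp(0.3 * z[:, 0] - 0.045)   # porosity, mean 0.15
h = 20.0 * np.exp(0.2 * z[:, 1] - 0.02)      # net pay [m], mean 20

# Pore volume per unit area: the deterministic "product of means" shortcut
# versus the expectation of the product.
classical = phi.mean() * h.mean()
mc = float(np.mean(phi * h))
print(classical, mc, mc / classical)
```

For these lognormal marginals the exact ratio is exp(rho * 0.3 * 0.2) ≈ 1.037, i.e. the classical estimate is low by a few percent, and the gap grows with the variability of the properties, consistent with the paper's observation that the error is worse in heterogeneous, low-quality reservoirs.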


Author(s):  
Changcong Zhou ◽  
Zhenzhou Lu ◽  
Guijie Li

The variance-based importance measure has proven to be an effective tool for reflecting the effects of input variables on the output. Owing to its desirable properties, researchers have paid considerable attention to improving the efficiency of computing variance-based importance measures. Based on the theory of point estimation, this article proposes a new algorithm that decomposes the importance measure into inner and outer parts and computes each part with a point estimate method. To discuss the impact of the distribution parameters of the input variables on the variance-based importance measure, a new concept, the kernel sensitivity of the variance-based importance measure, is put forward, and solution algorithms based on numerical simulation and point estimation are established. For cases where the performance function of independent, normally distributed input variables is a linear or quadratic polynomial without cross terms, analytical results for the variance-based importance measure and the kernel sensitivity are derived. Numerical and engineering examples illustrate the applicability of the proposed concept and algorithm.
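The inner/outer decomposition can be sketched with Gauss-Hermite quadrature as the point estimate: the inner part evaluates E[Y | X_i] at quadrature nodes, the outer part takes the variance over those nodes, giving V_i = Var(E[Y | X_i]). The model below is a hypothetical quadratic without cross terms, chosen because its first-order variances are known analytically (V1 = a1^2, V2 = 2*a2^2), so the quadrature result can be checked; this is a generic sketch, not the article's algorithm.

```python
import numpy as np

# Y = a1*X1 + a2*X2^2 with independent standard-normal inputs.
a1, a2 = 2.0, 1.0

# Probabilists' Gauss-Hermite nodes/weights; normalize the weights to sum
# to 1 so that sum(w * f(x)) approximates E[f(X)] for X ~ N(0, 1).
nodes, weights = np.polynomial.hermite_e.hermegauss(7)
weights = weights / weights.sum()

def model(x1, x2):
    return a1 * x1 + a2 * x2**2

def first_order_variance(i):
    # Inner point estimate: E[Y | X_i = node] at each node of variable i.
    cond_mean = []
    for xi in nodes:
        vals = model(xi, nodes) if i == 0 else model(nodes, xi)
        cond_mean.append(np.dot(weights, vals))
    cond_mean = np.array(cond_mean)
    # Outer point estimate: variance of the conditional means over the nodes.
    m = np.dot(weights, cond_mean)
    return float(np.dot(weights, (cond_mean - m) ** 2))

V1, V2 = first_order_variance(0), first_order_variance(1)
print(V1, V2)   # analytic values: a1**2 and 2 * a2**2
```

Because the integrands here are low-degree polynomials, seven nodes make the quadrature exact up to floating-point error, which is the efficiency argument for point-estimate methods over sampling.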


2020 ◽  
pp. 1-24
Author(s):  
Pan Wang ◽  
Haihe Li ◽  
Xiaoyu Huang ◽  
Zheng Zhang ◽  
Sinan Xiao

Abstract For reliability-oriented sensitivity analysis with respect to the parameters of input variables, introducing a copula function to describe the joint probability distribution of dependent input variables allows the reliability-oriented sensitivity to be decomposed into an independent sensitivity and a dependent sensitivity, which measure the influence of the distribution parameters separately. Since the parameters of multivariate copula functions are difficult to estimate and inflexible in high dimensions, bivariate copulas are preferred in practice. The vine copula model is therefore employed to factorize the multivariate joint probability density function (PDF) into a product of bivariate copulas and the marginal PDFs of all variables. Based on copula theory, the computation of the reliability-oriented sensitivity with dependent variables can be transformed into the computation of a kernel function for each marginal PDF and of a copula kernel function for each pair-copula PDF involved in the vine factorization. A general numerical approach is proposed to compute the separate sensitivities. Finally, numerical examples and engineering applications validate the proposed method.
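A vine factorization is built from bivariate copulas; the smallest meaningful sketch is therefore a single bivariate Gaussian copula used to couple two marginals for a reliability computation. The marginals, correlation, and limit state below are invented for illustration, and the comparison against independent inputs shows why the dependence structure matters for the failure probability.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(6)
n = 200_000

# Standard normal CDF, vectorized from math.erf.
std_normal_cdf = np.vectorize(lambda t: 0.5 * (1.0 + erf(t / sqrt(2.0))))

# Dependent inputs from a bivariate Gaussian copula (correlation rho) with
# Exp(1) marginals, built by the inverse-CDF transform of the uniforms.
rho = 0.7
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], n)
u = np.clip(std_normal_cdf(z), 1e-16, 1.0 - 1e-16)
x = -np.log(1.0 - u)

# Hypothetical limit state: failure when x1 + x2 exceeds 8.
pf_dep = float(np.mean(x[:, 0] + x[:, 1] > 8.0))

# Independent inputs with the same marginals, for comparison.
xi = rng.exponential(1.0, size=(n, 2))
pf_ind = float(np.mean(xi.sum(axis=1) > 8.0))
print(pf_dep, pf_ind)
```

With positive correlation the two inputs tend to be large together, so the sum's tail, and hence the failure probability, is noticeably heavier than in the independent case; a vine model repeats this pair-copula construction across all variable pairs in the factorization.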

