Targeted Reduction of p-Boxes in Risk Assessments With Mixed Aleatory and Epistemic Uncertainties

Author(s):  
Jeremy Rohmer

Abstract The treatment of uncertainty using extra-probabilistic approaches, like intervals or p-boxes, allows for a clear separation between epistemic uncertainty and randomness in the results of risk assessments. This can take the form of an interval of failure probabilities, whose width W is an indicator of "what is unknown." In some situations, W is too large to be informative. To overcome this problem, we propose to reverse the usual chain of treatment: we start from the target value of W that is acceptable to support decision-making, and quantify the reduction in the input p-boxes necessary to achieve it. We assess the feasibility of this procedure using two case studies (risk of dike failure, and risk of rupture of a frame structure subjected to lateral loads). By making the link with the estimation of excursion sets (i.e., the set of points where a function takes values below some prescribed threshold), we alleviate the computational burden of the procedure by combining Gaussian process (GP) metamodels with sequential design of computer experiments. The considered test cases show that the estimates can be achieved with only a few tens of calls to the computationally intensive algorithm for mixed aleatory/epistemic uncertainty propagation.
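The interval of failure probabilities and its width W can be sketched with a minimal double-loop computation. In this hypothetical setup (not from the paper's case studies), the epistemic parameter is the mean of an aleatory load, known only as an interval; because the failure probability is monotone in that mean, evaluating the interval endpoints suffices:

```python
import numpy as np

rng = np.random.default_rng(0)

def failure_prob(mu_load, n=100_000):
    """Aleatory loop: P(load > capacity) for a fixed epistemic parameter mu_load."""
    load = rng.normal(mu_load, 0.5, n)  # aleatory variability of the load
    capacity = 10.0                     # hypothetical deterministic capacity
    return np.mean(load > capacity)

# Epistemic loop: mu_load is only known to lie in an interval (a degenerate
# p-box). Failure probability is monotone in mu_load, so the endpoints give
# the interval [p_lo, p_hi] of failure probabilities.
mu_interval = (8.0, 9.0)
p_lo = failure_prob(mu_interval[0])
p_hi = failure_prob(mu_interval[1])
W = p_hi - p_lo  # interval width: "what is unknown"
```

Narrowing the epistemic interval on mu_load shrinks W, which is exactly the reduction the paper proposes to target.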

2013, Vol. 18 (7), pp. 1393-1403
Author(s):  
Julie Clavreul ◽  
Dominique Guyonnet ◽  
Davide Tonini ◽  
Thomas H. Christensen

Author(s):  
Alessandra Cuneo ◽  
Alberto Traverso ◽  
Shahrokh Shahpar

In engineering design, uncertainty is inevitable and can cause a significant deviation in the performance of a system. Uncertainty in input parameters can be categorized into two groups: aleatory and epistemic. The work presented here focuses on aleatory uncertainty, which causes natural, unpredictable and uncontrollable variations in the performance of the system under study. Such uncertainty can be quantified using statistical methods, but the main obstacle is often the computational cost, because the representative model is typically highly non-linear and complex. It is therefore necessary to have a robust tool that can perform the uncertainty propagation with as few evaluations as possible. In the last few years, different methodologies for uncertainty propagation and quantification have been proposed. This study evaluates four methods to demonstrate the strengths and weaknesses of each approach. The first method considered is Monte Carlo simulation, a sampling method that can give high accuracy but needs a relatively large computational effort. The second method is Polynomial Chaos, an approximation method in which the probabilistic parameters of the response function are modelled with orthogonal polynomials. The third method considered is the Mid-range Approximation Method, which assembles multiple meta-models into one model to perform optimization under uncertainty. The fourth method applies the first two methods not to the model directly but to a response surface representing the model, to decrease computational cost. All these methods have been applied to a set of analytical test functions and engineering test cases. Relevant aspects of engineering design and analysis, such as a high number of stochastic variables and optimized design problems with and without stochastic design parameters, were assessed.
Polynomial Chaos emerges as the most promising methodology and was then applied to a turbomachinery test case based on a thermal analysis of a high-pressure turbine disk.
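The contrast between the first two methods can be sketched on a toy response function where the polynomial chaos expansion is available in closed form. For f(x) = x² with x ~ N(0,1), the Hermite expansion is exact (this example is illustrative, not one of the study's test functions):

```python
import numpy as np

rng = np.random.default_rng(1)

def model(x):
    return x**2  # toy response function standing in for an expensive solver

# 1) Monte Carlo: sample the aleatory input and propagate it directly.
x = rng.standard_normal(200_000)
y = model(x)
mc_mean, mc_var = y.mean(), y.var()

# 2) Polynomial chaos (probabilists' Hermite basis): x^2 = He0(x) + He2(x),
#    so the coefficients are c0 = 1, c2 = 1, giving exactly
#    mean = c0 = 1 and var = sum_{i>=1} c_i^2 * i! = 1 * 2! = 2.
pc_mean, pc_var = 1.0, 2.0
```

Monte Carlo converges to the same moments, but only at the cost of many model evaluations, which is the trade-off the study quantifies.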


2011, Vol. 133 (2)
Author(s):  
Kais Zaman ◽  
Mark McDonald ◽  
Sankaran Mahadevan

This paper develops and illustrates a probabilistic approach for uncertainty representation and propagation in system analysis, when the information on the uncertain input variables and/or their distribution parameters may be available as either probability distributions or simply intervals (single or multiple). A unique aggregation technique is used to combine multiple interval data and to compute rigorous bounds on the system response cumulative distribution function. The uncertainty described by interval data is represented through a flexible family of probability distributions. Conversion of interval data to a probabilistic format enables the use of computationally efficient methods for probabilistic uncertainty propagation. Two methods are explored for the implementation of the proposed approach, based on (1) sampling and (2) optimization. The sampling-based strategy is more expensive and tends to underestimate the output bounds; the optimization-based methodology improves on both counts. The proposed methods are used to develop new solutions to challenge problems posed by the Sandia epistemic uncertainty workshop (Oberkampf et al., 2004, "Challenge Problems: Uncertainty in System Response Given Uncertain Parameters," Reliab. Eng. Syst. Saf., 85, pp. 11–19). Results for the challenge problems are compared with earlier solutions.
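Bounding a response CDF under an interval input can be sketched as follows. In this hypothetical example the response y = a + ε is monotone in the interval-valued parameter a, so the optimization over a reduces to evaluating the interval endpoints (in general a search over the interval is needed):

```python
import math
import numpy as np

def normal_cdf(z):
    """Standard normal CDF via the error function (stdlib only)."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Epistemic input: offset a known only as an interval; aleatory noise ~ N(0,1).
a_lo, a_hi = 1.0, 2.0
y = np.linspace(-2.0, 6.0, 9)

# Because y = a + eps is monotone increasing in a, the bounding CDFs are
# attained at the interval endpoints.
F_lower = np.array([normal_cdf(v - a_hi) for v in y])  # pessimistic CDF
F_upper = np.array([normal_cdf(v - a_lo) for v in y])  # optimistic CDF
```

The pair (F_lower, F_upper) is a p-box on the response: every CDF consistent with the interval information lies between the two curves.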


2017
Author(s):  
Matan Avital ◽  
Michael Davis ◽  
Ory Dor ◽  
Ronnie Kamai

Abstract. We present a full PSHA sensitivity analysis for two sites in southern Israel – one in the near field of a major fault system and one farther away. The PSHA is conducted for alternative source representations, using alternative model parameters for the main seismic sources, such as slip rate and Mmax, among others. The analysis also considers the effect of the Ground-Motion Prediction Equation (GMPE) on the hazard results. In this way, the two types of epistemic uncertainty – modelling uncertainty and parametric uncertainty – are treated and addressed. We quantify the uncertainty propagation by testing its influence on the final calculated hazard, such that the controlling knowledge gaps are identified and can be treated in future studies. We find that current practice in Israel, as represented by the most recent version of the building code, grossly underestimates the hazard, due to a combination of factors, including the source definitions as well as the GMPE used for the analysis.
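The standard way alternative GMPEs enter a PSHA is as weighted logic-tree branches; a minimal sketch with purely hypothetical hazard curves and weights (none of these numbers come from the study):

```python
import numpy as np

# Hypothetical annual exceedance frequencies from two alternative GMPEs,
# tabulated at a few PGA levels (illustrative values only).
pga = np.array([0.1, 0.2, 0.4, 0.8])            # g
haz_gmpe_a = np.array([1e-2, 4e-3, 1e-3, 2e-4])
haz_gmpe_b = np.array([2e-2, 8e-3, 2e-3, 5e-4])
weights = (0.6, 0.4)                             # epistemic branch weights

# Weighted-mean hazard over the logic tree; the branch-to-branch spread is
# a direct measure of the epistemic (modelling) uncertainty in the hazard.
mean_hazard = weights[0] * haz_gmpe_a + weights[1] * haz_gmpe_b
spread = haz_gmpe_b - haz_gmpe_a
```

A sensitivity analysis of the kind described above amounts to recomputing mean_hazard while swapping branches or perturbing branch parameters (slip rate, Mmax) and recording which choice moves the curve the most.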


Author(s):  
NICOLA PEDRONI ◽  
ENRICO ZIO

Risk analysis models describing aleatory (i.e., random) events contain parameters (e.g., probabilities, failure rates, …) that are epistemically uncertain, i.e., known with poor precision. Whereas aleatory uncertainty is always described by probability distributions, epistemic uncertainty may be represented in different ways (e.g., probabilistic or possibilistic), depending on the information and data available. The work presented in this paper addresses the issue of accounting for (in)dependence relationships between epistemically uncertain parameters. When a probabilistic representation of epistemic uncertainty is considered, uncertainty propagation is carried out by a two-dimensional (or double) Monte Carlo (MC) simulation approach; instead, when possibility distributions are used, two approaches are undertaken: the hybrid MC and Fuzzy Interval Analysis (FIA) method, and the MC-based Dempster-Shafer (DS) approach employing Independent Random Sets (IRSs). The objectives are: i) to study the effects of (in)dependence between the epistemically uncertain parameters of the aleatory probability distributions (when a probabilistic/possibilistic representation of epistemic uncertainty is adopted), and ii) to study the effect of the probabilistic/possibilistic representation of epistemic uncertainty (when the state of dependence between the epistemic parameters is defined). The Dependency Bound Convolution (DBC) approach is then undertaken within a hierarchical setting of hybrid (probabilistic and possibilistic) uncertainty propagation, in order to account for all kinds of (possibly unknown) dependences between the random variables. The analyses are carried out with reference to two toy examples, built in such a way as to allow a fair quantitative comparison between the methods and an evaluation of their rationale and appropriateness in relation to risk analysis.
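The two-dimensional (double) Monte Carlo approach mentioned above can be sketched on a toy reliability problem: an outer loop samples the epistemically uncertain parameter from its assumed distribution, and an inner loop propagates the aleatory variability for that parameter value (all distributions and numbers here are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)

# Outer loop: epistemically uncertain failure rate of an exponential
# lifetime, itself given a (subjective) uniform distribution.
# Inner loop: aleatory lifetimes for that fixed rate.
n_outer, n_inner, mission_time = 200, 5_000, 10.0
p_fail = np.empty(n_outer)
for i in range(n_outer):
    lam = rng.uniform(0.01, 0.05)                    # epistemic sample
    lifetime = rng.exponential(1.0 / lam, n_inner)   # aleatory samples
    p_fail[i] = np.mean(lifetime < mission_time)     # conditional failure prob.

# The spread of p_fail across outer samples reflects the epistemic
# uncertainty; a percentile band summarizes it.
p_lo, p_hi = np.percentile(p_fail, [5, 95])
```

The possibilistic alternatives (hybrid MC-FIA, DS with IRSs) replace the outer probabilistic sampling with α-cuts or random sets, but keep the same two-level structure.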


Author(s):  
Jaka Propika ◽  
Dita Kamarul Fitriyah ◽  
Yanisfa Septiarsilia

ABSTRACT The use of composite columns is widespread in high-rise buildings. Composite columns are generally divided into two types: a steel section encased in concrete is called an inside-steel column (concrete-encased column), while a steel section filled with concrete is called an outside-steel column (concrete-filled column). The use of an outside-steel composite column as the main column supporting lateral loads in a building frame structure is not yet common in current construction practice. Therefore, the strength of the two types of composite column must be analysed to determine which type is the most effective and has the highest strength. Manual calculations were performed on box-shaped inside-steel and outside-steel composite columns, while calculations with the CSiCOL program were carried out on all composite columns. The resulting ØPn and ØMn values were then compared between the manual calculations and the CSiCOL program. The results show that the outside-steel composite column performs better than the inside-steel composite column at the standard volume of a 400x400 mm inside-steel box column. The round outside-steel composite column with a 431 mm diameter is 17% superior in resisting nominal axial force (ØPn) compared with all other composite column types, while the box-shaped outside-steel composite column measuring 405.70x405.70 mm is 10.5% superior in resisting nominal moment (ØMn) compared with all other composite column types.
Keywords: composite column; inside steel (concrete-encased column); outside steel (concrete-filled column)
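The axial-capacity comparison rests on the composite squash-load formula P0 = Fy·As + 0.85·f'c·Ac (AISC 360 style, reinforcement omitted). A minimal sketch with illustrative dimensions and material grades, not the study's actual sections:

```python
# Hypothetical material grades (MPa): steel yield stress and concrete strength.
fy, fc = 250.0, 30.0

def squash_load(As_mm2, Ac_mm2):
    """Nominal axial capacity P0 (kN) of a composite column cross-section:
    steel contribution Fy*As plus concrete contribution 0.85*f'c*Ac."""
    return (fy * As_mm2 + 0.85 * fc * Ac_mm2) / 1e3

# Concrete-filled 400x400 mm box with a 10 mm wall (outside-steel column):
As = 400**2 - 380**2   # steel tube area, mm^2
Ac = 380**2            # concrete core area, mm^2
p0_filled = squash_load(As, Ac)
```

The same function applied to an encased (inside-steel) section of equal overall volume gives the kind of like-for-like comparison the abstract reports; confinement and slenderness effects, which the code-based calculation also includes, are ignored in this sketch.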


2021, Vol. 896 (1), pp. 012035
Author(s):  
M Bougofa ◽  
A Bouafia ◽  
A Baziz ◽  
S Aberkane ◽  
R Kharzi ◽  
et al.

Abstract Probabilistic modeling is widely used in industrial practice, particularly for assessing complex systems' safety, risk, and reliability. Conventional risk analysis methodologies generally have a limited ability to deal with dependence, failure behavior, and epistemic uncertainty such as parameter uncertainty. This work proposes a risk-based reliability assessment approach using a dynamic evidential network (DEN). The proposed model integrates Dempster-Shafer theory (DST), for describing parameter uncertainty, with a dynamic Bayesian network (DBN), for dependency representation and multi-state system reliability. The approach propagates uncertainty through conditional belief mass tables (CBMTs). Because the results are obtained as intervals, the risk can be analyzed in the spirit of interval theory; ignoring this uncertainty may lead to biased results, so the epistemic uncertainty should be adequately characterized before performing the risk analysis. A case study of a level control system is used to highlight the methodology's ability to capture dynamic changes in the process, uncertainty modeling, and sensitivity analysis that can support decision-making.
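The interval result that DST produces can be sketched with the belief/plausibility pair of an event. Here the failure probability of a single component is described by focal intervals with belief masses (all values hypothetical, and much simpler than the networked CBMTs of the paper):

```python
# Focal elements: (interval, belief mass). Masses must sum to 1.
focal = [((0.00, 0.05), 0.6),
         ((0.02, 0.10), 0.3),
         ((0.00, 0.20), 0.1)]

event = (0.0, 0.08)  # "failure probability is below 0.08"

def bel_pl(event, focal):
    """Belief sums masses of focal intervals entirely inside the event;
    plausibility sums masses of those merely intersecting it."""
    lo, hi = event
    bel = sum(m for (a, b), m in focal if lo <= a and b <= hi)
    pl = sum(m for (a, b), m in focal if a <= hi and b >= lo)
    return bel, pl

bel, pl = bel_pl(event, focal)  # [bel, pl] brackets the unknown probability
```

The gap pl - bel is the epistemic part of the answer; it shrinks as the focal intervals narrow, mirroring the abstract's point that ignoring it biases the risk estimate.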

