Sensitivity Analysis of Epistemic Uncertainty on Input Parameters and System Structure Using Dempster-Shafer Theory

Author(s):  
Yunhui Hou

Abstract In this article, a method is proposed to conduct a global sensitivity analysis of epistemic uncertainty on both system inputs and system structure, a situation common in the early stages of system development, using Dempster-Shafer theory (DST). In system reliability assessment, the inputs correspond to component reliabilities, and the system structure is given by the system reliability function, cut sets, or a truth table. First, a method to propagate real-valued mass functions through set-valued mappings is introduced and applied to system reliability calculation. Second, we propose a method to model an uncertain system with multiple possible structures and show how to obtain the mass function of system-level reliability. Finally, we propose an indicator for global sensitivity analysis. The method is illustrated, and its efficacy demonstrated, by numerical application to two case studies.
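The propagation step described above can be sketched in a few lines. The following is a minimal illustration, not the article's actual algorithm: interval-valued component mass functions are pushed through a monotone system reliability function (a set-valued mapping), with the image of each box of intervals bounded by its corner values. The interval endpoints and the series structure are illustrative assumptions.

```python
from itertools import product

def propagate(component_bpas, system_fn):
    """Propagate interval-valued mass functions through a monotone
    system reliability function (a set-valued mapping).

    component_bpas: list of dicts mapping (lo, hi) reliability
    intervals to masses; system_fn maps a list of component
    reliabilities to the system reliability.
    """
    system_bpa = {}
    for combo in product(*(bpa.items() for bpa in component_bpas)):
        intervals = [iv for iv, _ in combo]
        mass = 1.0
        for _, m in combo:
            mass *= m
        # For a monotone system function, the image of a box of
        # intervals is an interval bounded by the corner values.
        lo = system_fn([iv[0] for iv in intervals])
        hi = system_fn([iv[1] for iv in intervals])
        system_bpa[(lo, hi)] = system_bpa.get((lo, hi), 0.0) + mass
    return system_bpa

# Two components in series: R_sys = R1 * R2 (illustrative numbers).
series = lambda rs: rs[0] * rs[1]
bpa1 = {(0.8, 0.9): 0.6, (0.9, 1.0): 0.4}
bpa2 = {(0.7, 0.8): 1.0}
result = propagate([bpa1, bpa2], series)
```

The system-level masses again sum to one, and each focal interval brackets the system reliability consistent with the component evidence.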

Author(s):  
Atiye Sarabi-Jamab ◽  
Babak N. Araabi

The complexity of computations in Dempster-Shafer theory (DST), due in particular to the large number of focal elements (FEs), motivates the development of approximation algorithms. Existing approximation methods include efficient algorithms for special hypothesis spaces, Monte Carlo-based techniques, and simplification procedures. In this paper, the quality of simplification-based approximation algorithms is evaluated using a new information-based comparison methodology. To this end, three structured testbeds are introduced, each designed with an eye on a particular form of uncertainty associated with a body of evidence (BoE) in DST, i.e., conflict and non-specificity. The three proposed testbeds, along with the classic testbed, are used to evaluate the accuracy and complexity of existing algorithms. In light of the proposed evaluation methodology, a new approximation method is presented as well. The proposed algorithm can choose the most informative FEs without being forced to select the FEs with the largest mass. Comparing the overall qualitative performance of the approximation algorithms yields an accuracy-versus-computational-time tradeoff for choosing an appropriate approximation method in different applications. Moreover, experiments with the testbeds indicate that our proposed algorithm enhances accuracy and computational tractability simultaneously.
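As a point of reference for the simplification family this abstract discusses, here is a minimal sketch of a classic summarization-style approximation (keep the k largest-mass FEs, pool the rest into the union of the discarded sets). This is a baseline of the kind being evaluated, not the new algorithm proposed in the paper.

```python
def summarize(bpa, k):
    """Summarization-style simplification: keep the k focal elements
    with the largest mass and pool the remainder into one focal
    element equal to the union of the discarded sets.

    bpa: dict mapping frozenset focal elements to masses.
    """
    ranked = sorted(bpa.items(), key=lambda kv: kv[1], reverse=True)
    kept = dict(ranked[:k])
    rest = ranked[k:]
    if rest:
        pooled = frozenset().union(*(fe for fe, _ in rest))
        pooled_mass = sum(m for _, m in rest)
        kept[pooled] = kept.get(pooled, 0.0) + pooled_mass
    return kept

# Illustrative BoE with four focal elements, reduced to three.
bpa = {frozenset({'a'}): 0.5, frozenset({'b'}): 0.3,
       frozenset({'a', 'c'}): 0.15, frozenset({'c'}): 0.05}
approx = summarize(bpa, 2)
```

Note the tradeoff the paper targets: pooling by mass alone can blur informative small-mass FEs into a non-specific set, which is exactly what an information-based selection criterion tries to avoid.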


2021
Author(s):  
Ias Sri Wahyuni ◽  
Rachid Sabre

In this article, we present a new method for multi-focus image fusion based on Dempster-Shafer theory using local variability (DST-LV). The method takes into account the variability of the observations of neighbouring pixels around the point studied. At each pixel, the method exploits the quadratic distance between the value I(x, y) of the pixel studied and the values of all pixels belonging to its neighbourhood. This local variability is used to determine the mass function. In this work, two classes of Dempster-Shafer theory are considered: the fuzzy part and the focused part. We show that our method gives significantly better results when compared to other methods.
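The local-variability measure at the heart of DST-LV can be sketched as the mean quadratic distance between a pixel and its window neighbours. The window size, image values, and function name below are illustrative assumptions, not the article's exact formulation.

```python
import numpy as np

def local_variability(img, x, y, radius=1):
    """Mean squared distance between pixel (x, y) and its neighbours
    in a (2*radius+1) x (2*radius+1) window, in the spirit of DST-LV."""
    h, w = img.shape
    centre = float(img[x, y])
    dists = []
    for i in range(max(0, x - radius), min(h, x + radius + 1)):
        for j in range(max(0, y - radius), min(w, y + radius + 1)):
            if (i, j) != (x, y):
                dists.append((float(img[i, j]) - centre) ** 2)
    return sum(dists) / len(dists)

# A sharply focused region shows higher local variability than a
# blurred (smoothed) one, which is what the mass function exploits.
sharp = np.array([[0, 255, 0], [255, 0, 255], [0, 255, 0]], dtype=float)
blurred = np.full((3, 3), 128.0)
v_sharp = local_variability(sharp, 1, 1)
v_blur = local_variability(blurred, 1, 1)
```

In a fusion pipeline, normalizing such variability scores across the source images at each pixel would yield the masses assigned to the "focused" versus "fuzzy" classes.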


Author(s):  
JOSE E. RAMIREZ-MARQUEZ ◽  
DAVID W. COIT ◽  
TONGDAN JIN

A new methodology is presented to allocate testing units to the different components within a system when the system configuration is fixed and there are budgetary constraints limiting the amount of testing. The objective is to allocate additional testing units so that the variance of the system reliability estimate, at the conclusion of testing, will be minimized. Testing at the component level decreases the variance of the component reliability estimate, which in turn decreases the variance of the system reliability estimate. The difficulty is to decide which components to test given the system-level implications of component reliability estimation. The results are enlightening because the components that most directly affect the system reliability estimation variance are often not those components with the highest initial uncertainty. The approach presented here can be applied to any system structure that can be decomposed into a series-parallel or parallel-series system with independent component reliability estimates. It is demonstrated using a series-parallel system as an example. The planned testing is to be allocated and conducted iteratively in distinct sequential testing runs so that the component and system reliability estimates improve as the overall testing progresses. For each run, a nonlinear programming problem must be solved based on the results of all previous runs. The testing allocation process is demonstrated on two examples.
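The observation that the most uncertain component is not necessarily the one driving system-level variance can be illustrated with a first-order (delta-method) variance decomposition for a simple series system. This is a textbook approximation offered for intuition, with illustrative numbers; it is not the paper's nonlinear programming formulation.

```python
def series_var_contributions(rels, variances):
    """Per-component contributions to the first-order (delta-method)
    variance of R_sys = prod(r_i) for a series system with
    independent component reliability estimates:
    contrib_i = (prod_{j != i} r_j)^2 * Var(r_i)."""
    contribs = []
    for i, var_i in enumerate(variances):
        partial = 1.0
        for j, r in enumerate(rels):
            if j != i:
                partial *= r
        contribs.append(partial ** 2 * var_i)
    return contribs

# Component 1 has twice the estimate variance of component 2, yet
# component 2 contributes more to system-level variance because its
# sensitivity coefficient (the other component's reliability, 0.95)
# is much larger than component 1's (0.50).
contribs = series_var_contributions([0.95, 0.50], [0.002, 0.001])
```

Minimizing the sum of such contributions under a testing budget is the flavour of the allocation problem solved at each sequential run.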


2020
Author(s):  
BOUKARI WADJIDOU ◽  
Ivana Todorovic ◽  
Long Fenjie

Abstract Background Maintaining a minimum number of workers in medical services is widely regarded as a key component of disease prevention. However, with the delay in confirming cases of SARS-CoV-2, understaffed medical providers were informed late, and the virus spread rapidly nationwide. Methods This study, based on the Dempster-Shafer theory (DST) method and Evidential Reasoning, assesses the risks posed by understaffing during the SARS-CoV-2 outbreak. Results The findings examine six (6) risk factors and show that the understaffing risk in 2019 was 0.14% in magnitude in Wuhan, compared to 0.27% in Shenzhen. When ranking understaffing risks from low to high, the findings show that they increased from 3.979% to 3.983% in Wuhan and from 3.998% to 4.002% in Shenzhen. Conclusions We first conclude that, from the SARS-CoV-1 to the SARS-CoV-2 outbreak, understaffing risk increased equally, by 0.004%, in both cities; however, Shenzhen is at higher risk than Wuhan. Second, understaffing in Shenzhen delayed SARS-CoV-2 outbreak prevention by 0.13% more than in Wuhan. We conclude overall that Shenzhen could be doubly worse off than Wuhan had it been the epicenter of the SARS-CoV-2 outbreak. Therefore, public health care training and employment policy must be optimized to address the staffing shortage, not only in these two cities but also in others, to prevent future outbreaks.
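The evidential core of such a DST-based risk assessment is the combination of mass functions from several risk factors. A minimal sketch of Dempster's rule of combination follows; the frame of discernment and the masses are hypothetical stand-ins for two of the study's risk-factor assessments, not its actual data.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions over the
    same frame; focal elements are frozensets."""
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb  # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict; sources cannot be combined")
    # Normalize by the non-conflicting mass.
    return {fe: m / (1.0 - conflict) for fe, m in combined.items()}

# Two hypothetical risk-factor assessments over {low, high}
# understaffing risk.
m1 = {frozenset({'low'}): 0.6, frozenset({'low', 'high'}): 0.4}
m2 = {frozenset({'high'}): 0.5, frozenset({'low', 'high'}): 0.5}
m12 = dempster_combine(m1, m2)
```

Evidential Reasoning extends this idea by weighting each source before combination, which is how the six risk factors would be aggregated into a single city-level risk score.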


Author(s):  
Luiz Alberto Pereira Afonso Ribeiro ◽  
Ana Cristina Bicharra Garcia ◽  
Paulo Sérgio Medeiros Dos Santos

The use of big data and information fusion in electronic health records (EHR) has allowed the identification of adverse drug reactions (ADR) through the integration of heterogeneous sources such as clinical notes (CN), medication prescriptions, and pathological examinations. This heterogeneity of data sources entails the need to address the redundancy, conflict, and uncertainty caused by the high dimensionality present in EHR. Multisensor information fusion (MSIF) presents an ideal scenario for dealing with uncertainty, especially when adding resources from the theory of evidence, also called Dempster-Shafer Theory (DST). In that scenario, the challenge is to specify the attribution of belief through the mass function derived from the datasets, named the basic probability assignment (BPA). The objective of the present work is to create a form of BPA generation using analysis of data regarding causal and time relationships between sources, entities, and sensors, not only through correlation but by causal inference.
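For context on what "BPA generation from the datasets" means in practice, here is a common distance-based baseline: singleton masses derived from a sample's proximity to class prototypes, with residual mass reserved for the whole frame as ignorance. This is a conventional scheme for illustration only, not the causal-inference approach proposed in this work, and all names and values are hypothetical.

```python
import math

def bpa_from_distances(sample, prototypes, alpha=1.0):
    """Distance-based BPA generation: convert a sample's distances to
    class prototypes into singleton masses, with the residual mass
    assigned to the full frame (ignorance)."""
    scores = {c: math.exp(-alpha * abs(sample - p))
              for c, p in prototypes.items()}
    # The "+1" in the denominator reserves mass for ignorance: the
    # weaker every match, the larger the mass on the whole frame.
    denom = sum(scores.values()) + 1.0
    bpa = {frozenset({c}): s / denom for c, s in scores.items()}
    bpa[frozenset(prototypes)] = 1.0 / denom
    return bpa

# Hypothetical example: a scalar feature compared with prototype
# values for an "ADR" vs "no ADR" signal.
bpa = bpa_from_distances(2.8, {'adr': 3.0, 'no_adr': 0.5})
```

A causal-inference-driven BPA would replace the purely geometric scores above with evidence weights derived from causal and temporal relationships between sources, which is the gap this work targets.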


Author(s):  
Sangjune Bae ◽  
Nam H. Kim ◽  
Seung-gyo Jang

Since the safety of a system is often assessed by the probability of failure, it is crucial to calculate this probability accurately in order to achieve the target safety. Despite such importance, calculating the precise probability is not a trivial task, due to the inherent aleatory variability and epistemic uncertainty. Therefore, the safety is assessed by a conservative estimate of the probability rather than a single value. In general, there are two ways to achieve the target probability: shifting the probability or reducing the uncertainty. In this paper, among the various sources of epistemic uncertainty, the uncertainty quantification error from sampling is considered in calculating the conservative estimate of a system's probability of failure. To quantify and shape the epistemic uncertainty, a Bayesian network is utilized to constitute the relationship between the system probability and component probabilities, while global sensitivity analysis is employed to connect the variance in the probabilities at the system level with that at the component level. Based on this, the local sensitivity of the conservative estimate with respect to a design change in a component is derived and approximated for simple numerical calculation using the Bayesian network and global sensitivity analysis. This shows how a design can meet probabilistic criteria under propagated uncertainty when the design changes.
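To make the "sampling error" part of this concrete, a conservative estimate of a failure probability from Monte Carlo sampling is typically an upper confidence bound on the sample estimate. The sketch below uses the standard normal approximation to the binomial; the sample counts are illustrative, and this is a generic construction rather than the paper's Bayesian-network-based derivation.

```python
import math

def conservative_pof(n_fail, n_samples, z=1.645):
    """Upper one-sided confidence bound on the probability of failure
    from n_samples Monte Carlo trials (normal approximation to the
    binomial; z = 1.645 gives roughly 95% confidence)."""
    p_hat = n_fail / n_samples
    std_err = math.sqrt(p_hat * (1.0 - p_hat) / n_samples)
    return p_hat + z * std_err

# Same observed failure rate (1%), but more samples shrink the
# sampling (epistemic) uncertainty and hence the conservative
# estimate -- one of the two routes to meeting a target probability.
loose = conservative_pof(10, 1_000)
tight = conservative_pof(100, 10_000)
```

The other route, shifting the probability itself via a design change, is what the paper's local sensitivity of the conservative estimate is built to evaluate.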

