Rule-Based Topological Vulnerability Analysis

Author(s): Vipin Swarup, Sushil Jajodia, Joseph Pamula


2012, Vol 66 (8), pp. 1766-1773
Author(s): J. Yazdi, S. A. A. S. Neyshabouri

Population growth and urbanization in recent decades have increased the vulnerability of properties and societies in flood-prone areas. Vulnerability analysis is one of the main factors used to determine the necessary flood risk reduction measures in floodplains. At present, vulnerability to natural disasters is analyzed by defining various physical and social indices. This study presents a model based on a fuzzy rule-based system to address the ambiguities and uncertainties arising from natural variability and from human knowledge and preferences in vulnerability analysis. The proposed method is applied to a small watershed as a case study, and the results are compared with one of the index approaches. Both approaches yield the same vulnerability ranking for the sub-basins in the watershed. Finally, using the vulnerability scores of the different sub-basins, a vulnerability map of the watershed is presented.
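The abstract does not specify the paper's rule base, but the general mechanics of a fuzzy rule-based vulnerability score can be sketched. The following is a minimal Mamdani-style sketch with two hypothetical inputs (flood depth and population density); the membership breakpoints, rules, and output scores are illustrative assumptions, not the authors' model.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def flood_vulnerability(depth_m, pop_density):
    """Tiny Mamdani-style inference: each rule fires at the strength of
    its weakest antecedent (min), and the result is a weighted average
    of each rule's representative vulnerability score in [0, 1]."""
    # Hypothetical membership functions for the two inputs.
    depth = {"low": tri(depth_m, -1, 0, 1.5),
             "high": tri(depth_m, 0.5, 2, 10)}
    dens = {"sparse": tri(pop_density, -1, 0, 3000),
            "dense": tri(pop_density, 1000, 5000, 20000)}
    # Rule base: (firing strength, representative output score).
    rules = [
        (min(depth["low"],  dens["sparse"]), 0.1),
        (min(depth["low"],  dens["dense"]),  0.4),
        (min(depth["high"], dens["sparse"]), 0.5),
        (min(depth["high"], dens["dense"]),  0.9),
    ]
    strength = sum(w for w, _ in rules)
    return sum(w * s for w, s in rules) / strength if strength else 0.0

print(flood_vulnerability(0.2, 500))   # shallow water, sparse area: low score
print(flood_vulnerability(3.0, 8000))  # deep water, dense area: high score
```

Scoring each sub-basin this way and binning the scores is what would produce the kind of vulnerability map the abstract describes.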


Author(s): Steven Noel, Matthew Elder, Sushil Jajodia, Pramod Kalapa, Scott O'Hare, ...


2019, Vol 9 (3), pp. 412-425
Author(s): Seyed Ashkan Zarghami, Indra Gunawan, Frank Schultmann

Purpose
The increased complexity of water distribution networks (WDNs) emphasizes the importance of studying the relationship between the topology and the vulnerability of these networks. However, the few existing studies on this subject measure vulnerability at a specific location and neglect to quantify the vulnerability of the network as a whole. The purpose of this paper is to fill this gap by extending topological vulnerability analysis to the global level.

Design/methodology/approach
This paper introduces a two-step procedure. In the first step, this work evaluates the degree of influence of a node by employing graph theory quantities. In the second step, information entropy is used as a tool to quantify the global vulnerability of WDNs.

Findings
The vulnerability analysis results showed that a network with uniformly distributed centrality values exhibits a lower drop in performance in the case of partial failure of its components and is therefore less vulnerable. Conversely, the failure of a highly central node leads to a significant loss of performance in the network.

Practical implications
The vulnerability analysis method developed in this work provides a decision support tool for implementing a cost-effective maintenance strategy, which relies on identifying and prioritizing vulnerabilities, thereby reducing expenditure on maintenance activities.

Originality/value
By situating the research in the context of entropy theory, this paper demonstrates for the first time how the heterogeneity and homogeneity of centrality values, measured by information entropy, can be interpreted in terms of network vulnerability.
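The entropy idea in the Findings can be illustrated with a small sketch. The abstract does not state which centrality measure or normalization the authors use, so the following assumes degree centrality and Shannon entropy normalized by log(n); a star network (centrality concentrated in a hub) then scores lower than a ring (centrality spread uniformly), matching the paper's homogeneous-is-less-vulnerable finding.

```python
import math

def degree_centrality(adj):
    """Fraction of other nodes each node is connected to."""
    n = len(adj)
    return {u: len(nbrs) / (n - 1) for u, nbrs in adj.items()}

def centrality_entropy(adj):
    """Shannon entropy of the normalized centrality distribution,
    scaled to [0, 1] by dividing by log(n).  Values near 1 mean
    centrality is spread uniformly across the network."""
    c = degree_centrality(adj)
    total = sum(c.values())
    probs = [v / total for v in c.values() if v > 0]
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(len(adj))

# A star concentrates centrality in the hub ...
star = {"hub": {"a", "b", "c", "d"},
        "a": {"hub"}, "b": {"hub"}, "c": {"hub"}, "d": {"hub"}}
# ... while a ring spreads it uniformly.
ring = {i: {(i - 1) % 5, (i + 1) % 5} for i in range(5)}

print(centrality_entropy(star))  # lower: heterogeneous, more vulnerable
print(centrality_entropy(ring))  # 1.0: homogeneous, less vulnerable
```

The uniform ring reaches the maximum normalized entropy of 1.0, while the star falls below it because the hub dominates the distribution.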


Author(s): Leonardo B. L. Santos, Aurelienne A. S. Jorge, Luciana R. Londe, Regina T. Reani, Roberta B. Bacelar, ...

Abstract. The measurement and mapping of transportation network vulnerability are subjects of global interest. During a flood, some elements of a transportation network can be reached by water, causing direct damages (to people, vehicles, and roads/streets) and indirect damages (to services), with great economic impacts. The Complex Networks approach offers a valuable perspective on one type of vulnerability especially relevant to Disaster Risk Reduction for critical infrastructures: topological vulnerability. The topological vulnerability index associated with an element of a graph is defined as the damage (variation) to the network's average efficiency caused by the removal of that element. We have performed a topological vulnerability analysis of the highways in the state of Santa Catarina (Brazil) and produced a risk map combining that index with the flood-susceptible areas. Our results can serve as an important tool for stakeholders in the transportation sector.
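The index defined in this abstract can be computed directly. The sketch below follows the stated definition — average efficiency is the mean of 1/d(i, j) over all node pairs, and an element's vulnerability is the relative drop in that efficiency when it is removed — on a toy unweighted graph; the graph itself is an illustrative assumption, not the Santa Catarina network.

```python
from collections import deque

def shortest_path_lengths(adj, source):
    """BFS distances from source in an unweighted graph (dict of sets)."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def average_efficiency(adj):
    """Mean of 1/d(i, j) over all ordered node pairs; unreachable
    pairs contribute 0, so disconnection is penalized naturally."""
    nodes = list(adj)
    n = len(nodes)
    if n < 2:
        return 0.0
    total = 0.0
    for s in nodes:
        dist = shortest_path_lengths(adj, s)
        total += sum(1.0 / d for t, d in dist.items() if t != s)
    return total / (n * (n - 1))

def topological_vulnerability(adj, node):
    """Relative drop in average efficiency when `node` is removed."""
    base = average_efficiency(adj)
    reduced = {u: {v for v in nbrs if v != node}
               for u, nbrs in adj.items() if u != node}
    return (base - average_efficiency(reduced)) / base

# Toy highway graph: "C" bridges the {A, B} cluster to the {D, E} chain.
graph = {
    "A": {"B", "C"}, "B": {"A", "C"},
    "C": {"A", "B", "D"},
    "D": {"C", "E"}, "E": {"D"},
}
print(topological_vulnerability(graph, "C"))  # ≈ 0.53: bridging node, large drop
print(topological_vulnerability(graph, "D"))  # ≈ 0.30: smaller drop
```

Ranking elements by this index and overlaying the flood-susceptible areas is the combination the risk map in the study is built from.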


1992, Vol 23 (1), pp. 52-60
Author(s): Pamela G. Garn-Nunn, Vicki Martin

This study explored whether standard administration and scoring of conventional articulation tests accurately identified children as phonologically disordered, and whether information from these tests established severity levels and programming needs. Results of standard scoring procedures from the Assessment of Phonological Processes-Revised, the Goldman-Fristoe Test of Articulation, the Photo Articulation Test, and the Weiss Comprehensive Articulation Test were compared for 20 phonologically impaired children. All tests identified the children as phonologically delayed/disordered, but the conventional tests failed to clearly and consistently differentiate varying severity levels. Conventional test results also showed limitations in error sensitivity, ease of computation in scoring procedures, and implications for remediation programming. The use of some type of rule-based analysis for phonologically impaired children is highly recommended.


Author(s): Bettina von Helversen, Stefan M. Herzog, Jörg Rieskamp

Judging other people is a common and important task. Every day, professionals make decisions that affect the lives of other people when they diagnose medical conditions, grant parole, or hire new employees. To prevent discrimination, professional standards require that decision makers render accurate and unbiased judgments based solely on relevant information. Facial similarity to previously encountered persons can be a potential source of bias. Psychological research suggests that people rely on similarity-based judgment strategies only if the provided information does not allow them to make accurate rule-based judgments. Our study shows, however, that facial similarity to previously encountered persons influences judgment even in situations in which relevant information is available for making accurate rule-based judgments, in which similarity is irrelevant to the task, and in which relying on similarity is detrimental. In two experiments in an employment context, we show that applicants who looked similar to high-performing former employees were judged as more suitable than applicants who looked similar to low-performing former employees. This similarity effect was found even though the participants used the relevant résumé information about the applicants by following a rule-based judgment strategy. These findings suggest that similarity-based and rule-based processes simultaneously underlie human judgment.

