Meta-Risk as a Method for Addressing Uncertainty in a Pipeline Risk Management System

Author(s):  
Louis Fenyvesi ◽  
Brian Rothwell ◽  
Iain Colquhoun

Typical risk assessment processes produce risk estimates by multiplying together single-valued, expected failure frequencies and associated consequences. However, a range of consequences can result from an incident, and a more representative estimate of failure frequency is captured by a distributed variable rather than by a single point value. Risk estimates calculated by typical assessment processes are sometimes referred to as “mean” estimates or “cautious best estimates”. This terminology acknowledges implicitly that there is truly a range of possible values. Meta-risk is a potential approach for analyzing risk that captures this uncertainty by utilizing distributions of failure frequency and consequence in place of point estimates. These distributions are combined to form a risk distribution that can then be used more directly in quantified decision making. Meta-risk improves on the principle of “As low as reasonably practicable” (ALARP) by acknowledging that the levels of uncertainty associated with models used in the risk assessment process are not equal. By providing “probability of exceedance” targets relative to defined risk acceptance criteria, the meta-risk approach allows for quantified decision making that addresses both the level of risk and the associated level of uncertainty. This process allows an analyst to compare risks more accurately from multiple hazards between which levels of uncertainty may vary greatly, and to quantify the benefits of integrity management strategies such as condition monitoring whose primary effect is to reduce uncertainty rather than to reduce risk directly.
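To illustrate the general idea of the distributional approach described above (this is a minimal sketch, not the authors' actual meta-risk model), failure frequency and consequence distributions can be combined by Monte Carlo sampling into a risk distribution, from which a "probability of exceedance" against an acceptance criterion falls out directly. All distribution parameters and the criterion below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Hypothetical distributions (all parameters are illustrative assumptions):
# failure frequency as lognormal (failures per km-year),
# consequence as lognormal (cost per failure).
failure_freq = rng.lognormal(mean=np.log(1e-4), sigma=0.8, size=N)
consequence = rng.lognormal(mean=np.log(2e6), sigma=1.2, size=N)

# Combine into a risk distribution rather than a single point estimate.
risk = failure_freq * consequence  # cost per km-year

# Probability of exceeding a defined risk acceptance criterion.
criterion = 1_000.0  # cost per km-year; illustrative threshold
p_exceed = np.mean(risk > criterion)

print(f"mean risk: {risk.mean():.1f}")
print(f"P(risk > criterion): {p_exceed:.3f}")
```

Narrowing either input distribution (e.g. through condition monitoring) reduces `p_exceed` even when the mean risk is unchanged, which is how an uncertainty-reducing measure can be credited quantitatively.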

Author(s):  
Alejandro Reyes ◽  
Otto Huisman

Workflows are the fundamental building blocks of business processes in any organization today. These workflows have attributes and outputs that make up various Operational, Management and Supporting processes, which in turn produce a specific outcome in the form of business value. Risk Assessment and Direct Assessment are examples of such processes; they define the individual tasks integrity engineers should carry out. According to ISO 55000, achieving excellence in Asset Management requires clearly defined objectives, transparent and consistent decision making, and a long-term strategic view. Specifically, it recommends well-defined policies and procedures (processes) to bring about performance and cost improvements, improved risk management, business growth and enhanced stakeholder confidence through compliance and improved reputation. In reality, such processes are interpreted differently all over the world, and the workflows that make up these processes are often defined by individual engineers and experts. Risk Assessment is an excellent example: significant local variations in data sources, threat sources and other data elements require the business to tailor the activities and models it uses. Successful risk management is about enabling transparent decision making through clearly defined process steps, but in practice it requires maintaining a degree of flexibility to tailor the process to specific organizational needs. In this paper, we introduce common building blocks that have been identified to make up a Risk Assessment process and further examine how these blocks can be connected to fulfil the needs of multiple stakeholders, including data administrators, integrity engineers and regulators.
Moving from a broader Business Process view to a more focused Integrity Management view, this paper will demonstrate how to formalize Risk Assessment processes by describing the activities, steps and deliverables of each using Business Process Model and Notation (BPMN) as the standard modeling technique and extending it with an integrity-specific notation we have called Integrity Modelling Language or IML. It is shown that flexible modelling of integrity processes based on existing standards and best practices is possible within a structured approach; one which guides users and provides a transparent and auditable process inside the organization and beyond, based on commonalities defined by best practice guidelines, such as ISO 55000.
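As a rough illustration of the "common building blocks connected into a process" idea (purely hypothetical; this is neither BPMN nor the paper's IML notation), a risk assessment workflow might be sketched as chained steps sharing a context:

```python
from dataclasses import dataclass, field
from typing import Callable

# Hypothetical building blocks for a risk assessment workflow; the names
# and structure are illustrative, not the paper's actual notation.
@dataclass
class Step:
    name: str
    run: Callable[[dict], dict]  # takes and returns a shared context

@dataclass
class Workflow:
    steps: list = field(default_factory=list)

    def add(self, step: Step) -> "Workflow":
        self.steps.append(step)
        return self  # allow chaining blocks together

    def execute(self, context: dict) -> dict:
        for step in self.steps:
            context = step.run(context)
        return context

wf = (Workflow()
      .add(Step("gather_data", lambda c: {**c, "data": [0.2, 0.5]}))
      .add(Step("assess_threats", lambda c: {**c, "threats": len(c["data"])}))
      .add(Step("report", lambda c: {**c, "report": f"{c['threats']} threats"})))

result = wf.execute({})
print(result["report"])
```

Because each step is an explicit, named unit, the sequence is auditable: the same block list that drives execution also documents the process for regulators and other stakeholders.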


Author(s):  
Andrejs Radionovs ◽  
Oleg Uzhga-Rebrov

The ability to evaluate risks is important in many areas of human activity: economics, ecology, etc. Environmental risk assessment is usually carried out on the basis of multiple, and sometimes conflicting, factors. Multiple-criteria decision-making (MCDM) methodology is one possible way to address this problem. The analytic hierarchy process (AHP) is the most commonly used MCDM method; it incorporates subjective, personal preferences into the risk assessment process. This human subjectivity, however, introduces vagueness-type uncertainty and requires decision making under uncertainty. This paper considers handling that uncertainty with fuzzy-based techniques. Multiple Fuzzy AHP methodologies have been developed by different authors; this paper compares them and, on the basis of that comparison, recommends the most appropriate Fuzzy AHP methodology for environmental risk assessment.
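As a minimal sketch of one widely documented Fuzzy AHP variant (Buckley's fuzzy geometric-mean method, here shown for three criteria; the pairwise judgements are invented for illustration), each judgement is a triangular fuzzy number (l, m, u):

```python
import numpy as np

# Fuzzy pairwise comparison matrix (3x3) of triangular fuzzy numbers (l, m, u);
# the judgements are illustrative, not taken from the paper.
M = [
    [(1, 1, 1),         (2, 3, 4),       (4, 5, 6)],
    [(1/4, 1/3, 1/2),   (1, 1, 1),       (1, 2, 3)],
    [(1/6, 1/5, 1/4),   (1/3, 1/2, 1),   (1, 1, 1)],
]

n = len(M)
# Fuzzy geometric mean of each row, component-wise.
g = [tuple(np.prod([M[i][j][k] for j in range(n)]) ** (1 / n) for k in range(3))
     for i in range(n)]

# Fuzzy weights: each g_i divided by the fuzzy sum of all g
# (lower/upper bounds swap when dividing by a fuzzy number).
total = tuple(sum(gi[k] for gi in g) for k in range(3))
w = [(gi[0] / total[2], gi[1] / total[1], gi[2] / total[0]) for gi in g]

# Defuzzify by the centroid (mean of l, m, u) and normalize.
crisp = np.array([sum(wi) / 3 for wi in w])
crisp /= crisp.sum()
print(crisp)  # crisp importance weights for the three criteria
```

Different Fuzzy AHP variants diverge mainly in this last stage, i.e. how fuzzy weights are synthesized and defuzzified, which is precisely what a comparison of the methodologies has to examine.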


Author(s):  
John W. Collins

Planning and decision making amidst programmatic and technological risks represent significant challenges for projects. This presentation addresses the four-step risk assessment process needed to determine a clear path forward to mature needed technology and design, license, and construct advanced first-of-a-kind nuclear power plants, including Small Modular Reactors. This four-step process has been carefully applied to the Next Generation Nuclear Plant.


2015 ◽  
Vol 198 (2) ◽  
pp. 204-211 ◽  
Author(s):  
Adaoha E. C. Ihekwaba ◽  
Ivan Mura ◽  
Pradeep K. Malakar ◽  
John Walshaw ◽  
Michael W. Peck ◽  
...  

Botulinum neurotoxins (BoNTs) produced by the anaerobic bacterium Clostridium botulinum are the most potent biological substances known to mankind. BoNTs are the agents responsible for botulism, a rare condition affecting the neuromuscular junction and causing a spectrum of diseases ranging from mild cranial nerve palsies to acute respiratory failure and death. BoNTs are a potential biowarfare threat and a public health hazard, since outbreaks of foodborne botulism are caused by the ingestion of preformed BoNTs in food. Currently, mathematical models relating to the hazards associated with C. botulinum, which are largely empirical, make major contributions to botulinum risk assessment. Evaluated using statistical techniques, these models simulate the response of the bacterium to environmental conditions. Though empirical models have been successfully incorporated into risk assessments to support food safety decision making, the process involves significant uncertainties, so the resulting decisions are frequently conservative and inflexible. The next step is to encode into the models cellular processes at the molecular level, especially the details of the genetic and molecular machinery. This addition strengthens the connection between biological mechanisms and botulism risk assessment and hazard management strategies. This review brings together elements currently described in the literature that will be useful in building quantitative models of C. botulinum neurotoxin production, and outlines how the established form of modeling could be extended to include these new elements. Ultimately, this can offer further contributions to risk assessments to support food safety decision making.
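As an example of the kind of empirical model described here, predictive microbiology commonly uses the Ratkowsky square-root model relating growth rate to temperature, sqrt(mu) = b(T - Tmin). The parameter values in this sketch are assumptions for illustration only, not fitted values for C. botulinum:

```python
# Illustrative empirical model of growth rate vs. temperature, of the kind
# used in predictive microbiology: the Ratkowsky square-root model,
# sqrt(mu) = b * (T - Tmin). The parameters below are hypothetical.
b, t_min = 0.03, 10.0  # assumed fit parameters; t_min in degrees C

def growth_rate(temp_c: float) -> float:
    """Predicted specific growth rate (per hour); zero at or below Tmin."""
    if temp_c <= t_min:
        return 0.0
    return (b * (temp_c - t_min)) ** 2

for t in (8.0, 20.0, 37.0):
    print(f"{t:5.1f} degC -> mu = {growth_rate(t):.4f} per hour")
```

Models of this form are fitted to observed growth data by regression and carry parameter uncertainty, which is one source of the conservatism in risk assessments noted above; mechanistic extensions aim to ground such curves in the underlying genetic and molecular machinery.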


Author(s):  
Iain R. Colquhoun ◽  
Evelyn Choong ◽  
Richard Kania ◽  
Ming Gao ◽  
Pat Wickenhauser

When the benefits of using risk-based decision making in pipeline integrity management programs have been identified, operators are immediately faced with the challenge of a large amount of risk analysis work. This work frequently has to be done with minimal resources and/or under logistical constraints that require a graduated approach extending over several years. In answering this challenge, a starting point must be identified that focuses resources where the risks are greatest. Since these locations are generally unknown in the first instance, a tool is needed to perform a first, high-level assessment that identifies areas requiring further or more detailed study to support the integrity management program. A robust tool is also needed to direct the assessments of smaller lines that might not require the detailed attention generally given to larger-diameter transmission lines. This paper describes the extension of a simple indexing methodology comprising both theoretical and historical components to produce such a tool. It describes the use of so-called "smart" defaults to account for missing data, and a rudimentary decision model that can be used to grade the risk results. Examples are given of applications of the methodology to a gathering system and to the high-level evaluation of a transmission system. The paper also compares the results obtained to those of other, more detailed methodologies.
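A hypothetical sketch of how an indexing scheme with "smart" defaults and a rudimentary grading model might work; the factor names, weights, defaults, and thresholds below are invented for illustration, not the paper's actual model:

```python
# Conservative fallback values used when a segment attribute is missing
# (illustrative "smart" defaults on a 1 = good .. 5 = poor scale).
DEFAULTS = {
    "coating_condition": 3,
    "soil_corrosivity": 4,
    "third_party_activity": 3,
}
WEIGHTS = {  # hypothetical relative importance of each factor
    "coating_condition": 0.40,
    "soil_corrosivity": 0.35,
    "third_party_activity": 0.25,
}

def index_score(segment: dict) -> float:
    """Weighted 1-5 index; missing attributes fall back to defaults."""
    return sum(w * segment.get(k, DEFAULTS[k]) for k, w in WEIGHTS.items())

def grade(score: float) -> str:
    """Rudimentary decision model grading the index result."""
    if score < 2.5:
        return "low - no further study"
    if score < 3.5:
        return "medium - monitor"
    return "high - detailed assessment"

seg = {"coating_condition": 2}  # other attributes missing -> defaults apply
s = index_score(seg)
print(f"score = {s:.2f}, grade = {grade(s)}")
```

Because the defaults are deliberately conservative, a segment with sparse data cannot quietly score "low"; missing information pushes it toward further study rather than away from it.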


2020 ◽  
Vol 12 (5) ◽  
pp. 93
Author(s):  
Jane Henriksen-Bulmer ◽  
Shamal Faily ◽  
Sheridan Jeary

Cyber Physical Systems (CPS) seamlessly integrate physical objects with technology, thereby blurring the boundaries between the physical and virtual environments. While this brings many opportunities for progress, it also adds a new layer of complexity to the risk assessment process when attempting to ascertain what privacy risks this might impose on an organisation. In addition, privacy regulations, such as the General Data Protection Regulation (GDPR), mandate assessment of privacy risks, including making Data Protection Impact Assessments (DPIAs) compulsory. We present the DPIA Data Wheel, a holistic privacy risk assessment framework based on Contextual Integrity (CI), that practitioners can use to inform decision making around the privacy risks of CPS. This framework facilitates comprehensive contextual inquiry into privacy risk, that accounts for both the elicitation of privacy risks, and the identification of appropriate mitigation strategies. Further, by using this DPIA framework we also provide organisations with a means of assessing privacy from both the perspective of the organisation and the individual, thereby facilitating GDPR compliance. We empirically evaluate this framework in three different real-world settings. In doing so, we demonstrate how CI can be incorporated into the privacy risk decision-making process in a usable, practical manner that will aid decision makers in making informed privacy decisions.


2018 ◽  
Vol 6 (3) ◽  
pp. 543-564 ◽  
Author(s):  
Paul R. Wyrwoll ◽  
R. Quentin Grafton ◽  
Katherine A. Daniell ◽  
Hoang Long Chu ◽  
Claudia Ringler ◽  
...  
