A Validation Strategy for Semi-Quantitative Risk Assessment in Pipeline Integrity Management

Author(s):  
Iain Colquhoun
Shahani Kariyawasam
Bill Gu
Evelyn Choong
Zupei Yang
...

Public concern and pressure from regulatory bodies are accelerating the need for pipeline operators to formalize and intensify their approach to integrity management. There is a growing acceptance of risk-based approaches, and semi-quantitative methods are attractive because they make the most efficient use of the available data. However, the risk models have to be validated and customized to specific pipeline systems, and the strategy for doing so has to be based on a clear understanding of the specific risk methodology used and of the industry requirements for the risk assessments it produces. The paper describes a validation strategy that can be used when only sparse data are available and that can act as a framework for incorporating additional data from integrity assessments and operational experience as these become available.

2018
Vol 46 (2)
pp. 185-209
Author(s):  
Laurel Eckhouse
Kristian Lum
Cynthia Conti-Cook
Julie Ciccolini

Scholars in several fields, including quantitative methodologists, legal scholars, and theoretically oriented criminologists, have launched robust debates about the fairness of quantitative risk assessment. As the Supreme Court considers addressing constitutional questions on the issue, we propose a framework for understanding the relationships among these debates: layers of bias. In the top layer, we identify challenges to fairness within the risk-assessment models themselves. We explain types of statistical fairness and the tradeoffs between them. The second layer covers biases embedded in data. Using data from a racially biased criminal justice system can lead to unmeasurable biases in both risk scores and outcome measures. The final layer engages conceptual problems with risk models: Is it fair to make criminal justice decisions about individuals based on groups? We show that each layer depends on the layers below it: Without assurances about the foundational layers, the fairness of the top layers is irrelevant.
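The tradeoffs between types of statistical fairness that the abstract mentions can be illustrated with a toy calculation. The numbers below are invented for illustration only; they show one well-known tension: when two groups have different base rates, a score that is equally calibrated for both groups (same positive predictive value among those flagged) generally cannot also equalize false positive rates.

```python
# Hypothetical illustration: with unequal base rates, a risk score that is
# equally calibrated for two groups cannot also equalize false positive rates.

def false_positive_rate(base_rate, flag_rate, ppv):
    # Among those flagged high risk, a fraction ppv are true positives;
    # the remainder are false positives, spread over the negative class.
    false_positives = flag_rate * (1 - ppv)
    negatives = 1 - base_rate
    return false_positives / negatives

# Identical calibration for both groups: PPV = 0.6 among those flagged.
ppv = 0.6
# Group A: 50% base rate, 40% flagged; Group B: 30% base rate, 24% flagged.
fpr_a = false_positive_rate(base_rate=0.5, flag_rate=0.40, ppv=ppv)
fpr_b = false_positive_rate(base_rate=0.3, flag_rate=0.24, ppv=ppv)

print(f"FPR group A: {fpr_a:.2f}")  # 0.32
print(f"FPR group B: {fpr_b:.2f}")  # about 0.14, despite identical calibration
```

With the same calibration, group A's false positive rate is more than twice group B's, so equalizing one fairness criterion forces the other apart.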


2021
Vol 20 (1)
Author(s):  
Katherine Huerne

Background: Direct-to-consumer genetic testing (DTC-GT) is a popular and fast-growing field within the healthcare industry. Consumers often pursue DTC-GT without a clear understanding of its epistemic and medical limitations. This report presents the current state of DTC-GT technology and highlights the ethical, legal, and social issues of DTC-GT. Methods: Quantitative methods such as systematic reviews were used to evaluate the field of DTC-GT. Experimental data were taken from randomized controlled trials and case studies of 23andMe. Qualitative sources such as newspaper articles and surveys were also used. Relevant policies and regulatory information were analyzed in the context of 23andMe. Broader ethical issues are analyzed through the social disability model and feminist ethics frameworks. Results: Several aspects of direct-to-consumer genetic testing are outlined: (i) regulatory and legal distinctions of DTC-GT that separate its use from conventional genetic testing, (ii) epistemic issues of the genetic testing process within the direct-to-consumer context, and (iii) ethical considerations of DTC-GT in regard to genetic health and genetic ancestry. Conclusion: This report does not take a position for or against the use of DTC-GT; rather, it highlights the key ethical issues often missed in the DTC-GT process. There is no perfect method for understanding genetic health and race. DTC-GT offers consumers the ease and power of taking genetic data ‘in their own hands’, at the cost of exacerbating geneticization and race essentialism. Until further work is done to address the epistemic, regulatory, and legal issues, the ethical implications of DTC-GT usage will persist.


Author(s):  
Rodolfo B. Sancio
Patricia Varela
David Vance
Kourosh Abdolmaleki
Millan Sen

Abstract Pipeline river crossings are typically managed by using a combination of flood monitoring, ground inspections, integrity assessments, and remediations. Using a probabilistic model to assess the likelihood of failure at river crossings would enable combined consideration of all factors that contribute to the failure threat, provide site rankings to support discrete mitigation prioritizations, allow for evaluation of whether a crossing is acceptable in regard to a risk target, and provide a “check” on the deterministic integrity management methods. This paper describes two models for estimating the pipeline probability of failure at river crossings. The first model is a qualitative scoring model that can be easily implemented by operators and consultants. This model employs a weighting-factors approach to consider the multiple variables that contribute to pipeline exposures and to overstress given exposure. The results may be applied to threat-rank diverse crossings, as well as to estimate the probability of failure at a crossing relative to that at historical failure sites. The second model is a semi-quantitative model that 1) estimates the likelihood of a crossing exposure occurring, 2) estimates the associated scour length, 3) assesses the pipeline's critical span length, and 4) quantifies the probability that a span length longer than the critical span length could form. This model may be applied to achieve the same goals as the qualitative model, and also to compare the probability of failure at a river crossing to a reliability target. Due to the complexity of this model and the paper length limits, it is conceptually described within this paper.
The results demonstrated that the model's output site rankings correlated reasonably with those estimated by pipeline integrity program managers, the scour depth and length prediction results were consistent with measured historical scours, and the pipeline probability of failure at the assessed river crossings was within expected ranges.
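The final step of the semi-quantitative model, quantifying the probability that a span longer than the critical span length forms, can be sketched as a simple Monte Carlo calculation. This is not the authors' implementation: the lognormal scour-span distribution, its parameters, the critical span length, and the exposure probability below are all assumed values for illustration.

```python
# Illustrative sketch: annual probability of failure at an exposed crossing
# as P(exposure) * P(scour span > critical span), with an assumed lognormal
# distribution of scour span length. All parameters are invented.
import math
import random

def prob_span_exceeds_critical(mu_log, sigma_log, critical_span_m,
                               p_exposure, n_trials=100_000, seed=42):
    """Monte Carlo estimate: PoF ~= P(exposure) * P(span > critical)."""
    rng = random.Random(seed)
    exceed = sum(
        1 for _ in range(n_trials)
        if rng.lognormvariate(mu_log, sigma_log) > critical_span_m
    )
    return p_exposure * exceed / n_trials

# Example: median predicted scour span ~20 m, critical span 35 m,
# annual exposure probability 0.02 (all assumed numbers).
pof = prob_span_exceeds_critical(mu_log=math.log(20), sigma_log=0.4,
                                 critical_span_m=35, p_exposure=0.02)
print(f"Annual probability of failure ~ {pof:.2e}")
```

In practice each factor would itself come from the exposure-likelihood and scour-length sub-models described in the abstract rather than from fixed inputs.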


Author(s):  
Robert W. Brewerton
Paul Geddes
Sava Medonos
Raghu Raman
Christopher C. E. Wilkins

The research and development activities following the Piper Alpha disaster have significantly improved the technical safety of oil & gas facilities offshore and onshore. This improvement resulted from the development of a goal-setting, risk-based approach, whose objective was to open routes to design optimization and remove earlier prescriptive constraints that addressed only the worst case. Despite this initiative, a Quantitative Risk Assessment (QRA), while still being carried out, often remains “disconnected” from the practical design, and prescriptive methods still take precedence. Resorting solely to a prescriptive approach can result in adequate protection being absent where it should be present and applied in areas where the likelihood of the hazard is low. This paper addresses the application, in facility design, of risk-based methods and of the known behavior of structures and equipment in accidents. It stresses the importance of practical experience in the application of fire and explosion protection, and of adequate design and operational experience. The paper focuses on fire and explosion hazards and is based on more than 30 years of the authors’ experience in supporting facility design and assessment. This approach has resulted in solutions with improved technical safety and significant cost savings. It addresses both new installations and modifications of existing facilities.


Author(s):  
Kaso Teha Nura
Fentaw Said Endris

This study aimed to assess levels of community awareness of the environmental effects of the growing use of forest products for livelihoods, and of community management practices, in Jimma Zone. The participants were local farming communities, development agents (DAs), and agriculture and natural resource, forest, and environmental protection officials and experts. The study employed a descriptive survey research design, and both qualitative and quantitative methods of data collection were used. To assess community environmental awareness, survey questionnaires (with both open- and closed-ended questions) were distributed to 240 sample respondents. Ten key informant interviews were conducted with the heads of the natural resource management offices of the selected woredas and with six development agents in the selected kebeles, and four focus group discussions (FGDs) of 10 members each were also employed to collect qualitative data. The findings show that all respondents were aware of forest and natural resource degradation; about 87% and 75.4% were aware of the clearing of forest to expand farmland for a growing population and of the cutting of trees for fuel wood, charcoal, and other forest products, respectively. Only very few respondents indicated a lack of community awareness of the sustainable use and management of forest resources (44.5%) or a lack of clear understanding of forest laws and regulations among the community (40.4%) as a cause of deforestation. The assessment of community awareness of forest resource degradation therefore shows that the farmers in the study area are aware of natural resource and environmental degradation. Based on these findings, it is recommended that educational and training programs for local communities be modified to take account of the existing knowledge and practices in a particular area.


2020
Vol 143 (3)
Author(s):  
Damir Tadjiev

Abstract Dynamic flexible risers are complex engineered systems, which provide a connection between topside (normally floating) facilities and subsea pipeline infrastructure on offshore oilfields. Such systems require the use of ancillary equipment to ensure the riser’s correct configuration is maintained throughout the service life. Industry experience shows that the integrity management of riser ancillary equipment is not always comprehensive, and failure of such equipment is one of the causes of premature removal of flexible risers from service. This article presents case studies from an operator’s experience with dynamic flexible risers in the UK North Sea covering a period of approximately 20 years. The case studies look at the anomalies identified in service by general visual inspection (GVI) using a remotely operated vehicle (ROV) and the lessons learned. Some of the anomalies, had they not been identified and addressed promptly, could have resulted in costly repairs, which demonstrates the importance of inspecting the ancillary equipment of flexible risers as a part of the riser integrity management strategy. The challenges associated with integrity management of ancillary equipment of dynamic flexible risers are also discussed. The case studies presented in this article demonstrate that ROV GVI is an effective method for identifying installation and in-service anomalies related to flexible riser ancillary equipment. The purpose of this article is to share lessons learned with the wider offshore oil and gas community. The information presented here may also be useful to other users of dynamic flexible riser systems when developing and/or implementing their subsea pipeline integrity management programs.


Author(s):  
Yannick Beauregard
Aaron Woo
Terry Huang

Pipeline risk models are used to prioritize integrity assessments and mitigative actions to achieve acceptable levels of risk. Some of these models rely on scores associated with parameters known or thought to contribute to a particular threat. For pipelines without in-line inspection (ILI) or direct assessment data, scores are often estimated by subject matter experts and, as a result, are highly subjective. This paper describes a methodology for reducing the subjectivity of risk model scores by quantitatively deriving the scores based on ILI and failure data. This method is applied to determine pipeline coating and soil interaction scores in an external corrosion likelihood model for uninspected pipelines. Insights are drawn from the new scores as well as from a comparison with scores developed by subject matter experts.
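The general idea of deriving scores from ILI data rather than expert judgment can be sketched as follows. This is not the paper's actual procedure: the coating types, feature counts, inspected mileages, and the linear 0-10 rescaling below are all invented for illustration.

```python
# Hypothetical sketch: turn ILI metal-loss feature densities by coating type
# into evidence-based likelihood-model scores. All data values are invented.

# coating type: (external-corrosion features found by ILI, inspected km)
ili_data = {
    "fusion bonded epoxy": (12, 400.0),
    "coal tar enamel":     (180, 350.0),
    "polyethylene tape":   (260, 150.0),
}

# Feature density (features per km) as the observed severity measure.
density = {coating: n / km for coating, (n, km) in ili_data.items()}

# Rescale densities linearly onto the model's 0-10 score range.
worst = max(density.values())
scores = {coating: round(10 * d / worst, 1) for coating, d in density.items()}

for coating, score in sorted(scores.items(), key=lambda kv: kv[1]):
    print(f"{coating:22s} score = {score}")
```

Such derived scores can then replace, or be compared against, the expert-assigned scores for uninspected pipelines with the same coating attributes.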


Author(s):  
Jane Dawson
Iain Colquhoun
Inessa Yablonskikh
Russell Wenz
Tuan Nguyen

Current risk assessment practice in pipeline integrity management tends to use semi-quantitative index-based or model-based methodologies. This approach has been found to be very flexible and to provide useful results for identifying high-risk areas and for prioritizing physical integrity assessments. However, as pipeline operators progressively adopt an operating strategy of continual risk reduction with a view to minimizing total expenditures within safety, environmental, and reliability constraints, the need for quantitative assessments of risk levels is becoming evident. Whereas reliability-based quantitative risk assessments can be and are routinely carried out on a site-specific basis, they require significant amounts of quantitative data for the results to be meaningful. This need for detailed and reliable data tends to make these methods unwieldy for system-wide risk assessment applications. This paper describes methods for estimating risk quantitatively through the calibration of semi-quantitative estimates to failure rates for peer pipeline systems. By applying point-value probabilities to the failure rates, deterministic quantitative risk assessment (QRA) provides greater rigor and objectivity than can usually be achieved through the implementation of semi-quantitative risk assessment results. The method permits a fully quantitative approach suited to the operator’s data availability, data quality, and analysis needs. The paper also discusses experiences of implementing this type of risk model in Pipeline Integrity Management System (PIMS) software and the integration of data via existing pipeline geographic information systems (GIS).
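The calibration step the abstract describes, anchoring relative semi-quantitative scores to an absolute failure rate observed on peer systems, can be sketched as a simple scaling. This is an illustration under assumed values, not the paper's algorithm: segment scores, lengths, and the peer failure rate below are invented.

```python
# Hypothetical sketch: scale relative likelihood scores so that the
# length-weighted average failure rate matches a peer-system rate, yielding
# deterministic point-value probabilities of failure per segment.

# (segment id, relative likelihood score, length in km) -- invented values.
segments = [("A", 2.0, 10.0), ("B", 5.0, 25.0), ("C", 8.0, 5.0)]

# Historical failure rate of peer pipeline systems (assumed value).
peer_rate = 1.0e-4  # failures per km-year

total_len = sum(km for _, _, km in segments)
weighted_score = sum(score * km for _, score, km in segments) / total_len

# Calibration factor mapping a unit of score to a failure rate (per km-year).
k = peer_rate / weighted_score

# Expected failures per year for each segment: rate * length.
pof = {sid: k * score * km for sid, score, km in segments}

system_rate = sum(pof.values()) / total_len
print(f"calibrated system rate = {system_rate:.2e} per km-yr")
```

By construction the calibrated system-wide rate reproduces the peer rate, while the relative ranking of segments from the semi-quantitative model is preserved.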

