Qualitative and Semi-Quantitative Model for Estimating the Probability of Failure at River Crossings

Author(s):  
Rodolfo B. Sancio ◽  
Patricia Varela ◽  
David Vance ◽  
Kourosh Abdolmaleki ◽  
Millan Sen

Abstract Pipeline river crossings are typically managed using a combination of flood monitoring, ground inspections, integrity assessments, and remediations. Using a probabilistic model to assess the likelihood of failure at river crossings would enable combined consideration of all factors that contribute to the failure threat, provide site rankings to support discrete mitigation prioritizations, allow for evaluation of whether a crossing is acceptable with regard to a risk target, and provide a “check” on the deterministic integrity management methods. This paper describes two models for estimating the pipeline probability of failure at river crossings. The first model is a qualitative scoring model that can be easily implemented by operators and consultants. This model employs a weighting-factors approach to consider the multiple variables that contribute to pipeline exposures and to overstress given exposure. The results may be applied to threat-rank diverse crossings, as well as to estimate the probability of failure at a crossing relative to that at historical failure sites. The second model is a semi-quantitative model that 1) estimates the likelihood of a crossing exposure occurring, 2) estimates the associated scour length, 3) assesses the pipeline's critical span length, and 4) quantifies the probability that a span longer than the critical span length could form. This model may be applied to achieve the same goals as the qualitative model, and also to compare the probability of failure at a river crossing to a reliability target. Due to the complexity of this model and the paper length limits, it is described conceptually within this paper. The results demonstrated that the model output site rankings correlated reasonably with those estimated by pipeline integrity program managers, the scour depth and length predictions were consistent with measured historical scours, and the pipeline probabilities of failure at the assessed river crossings were within expected ranges.
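
The following is a minimal sketch of how a weighting-factors scoring model of the kind described above might rank crossings. The factor names, weights, and 1–5 scoring scale are illustrative assumptions, not the model published in the paper.

```python
# Minimal sketch of a weighting-factors scoring model for ranking river
# crossings. The factor names, weights, and 1-5 scoring scale are
# illustrative assumptions, not the model published in the paper.

FACTORS = {
    # factor name: weight (assumed to sum to 1.0)
    "depth_of_cover": 0.30,
    "channel_migration": 0.20,
    "flood_history": 0.20,
    "bank_stability": 0.15,
    "pipe_vulnerability": 0.15,
}

def crossing_score(ratings: dict) -> float:
    """Weighted score for one crossing; ratings are 1 (benign) to 5 (severe)."""
    return sum(FACTORS[name] * ratings[name] for name in FACTORS)

# Rank a set of crossings by score (higher score = higher relative threat).
crossings = {
    "Crossing A": {"depth_of_cover": 4, "channel_migration": 3, "flood_history": 5,
                   "bank_stability": 2, "pipe_vulnerability": 3},
    "Crossing B": {"depth_of_cover": 2, "channel_migration": 2, "flood_history": 1,
                   "bank_stability": 3, "pipe_vulnerability": 2},
}
for name, score in sorted(((n, crossing_score(r)) for n, r in crossings.items()),
                          key=lambda x: -x[1]):
    print(f"{name}: {score:.2f}")
```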

Author(s):  
Vlad Semiga ◽  
Aaron Dinovitzer

Fitness-for-service assessments of oil and gas pipelines, conducted either at the design stage or to evaluate an identified anomaly, are generally carried out in a deterministic manner based on conservative estimates of the required input parameters. This paper presents a probabilistic Fitness-for-Service (FFS) assessment approach which can be used in a risk-based pipeline integrity management program. The probabilistic assessment utilizes an advanced Monte Carlo simulation-based approach and the fracture mechanics techniques described in BS 7910. The paper presents an overview of the basic approach and provides a demonstration of its capabilities in terms of estimating the risk of failure (or probability of failure) associated with a pipeline over time, due to the presence of a crack-like flaw. The paper also discusses the sources of data and inherent assumptions used to model various input parameters required for a typical FFS analysis carried out according to BS 7910.
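
The general shape of such a calculation can be sketched as follows. The limit state below is a deliberately simplified placeholder (applied stress intensity versus toughness), not the BS 7910 failure assessment diagram used in the paper, and every distribution parameter is an illustrative assumption chosen to give a visible failure rate.

```python
# Minimal sketch of a Monte Carlo probability-of-failure estimate for a
# crack-like flaw. The limit state is a simplified placeholder (applied
# stress intensity vs. toughness), NOT the BS 7910 failure assessment
# diagram; all distribution parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000  # Monte Carlo trials

# Assumed input distributions (illustrative values only).
hoop_stress = rng.normal(300.0, 30.0, N)          # MPa, applied hoop stress
crack_depth = rng.lognormal(np.log(6.0), 0.5, N)  # mm, flaw depth
toughness   = rng.normal(70.0, 12.0, N)           # MPa*sqrt(m), fracture toughness

# Simplified limit state: K = Y * sigma * sqrt(pi * a) compared against toughness.
Y = 1.12                      # geometry factor (assumed constant)
a_m = crack_depth / 1000.0    # mm -> m
K_applied = Y * hoop_stress * np.sqrt(np.pi * a_m)

failures = K_applied > toughness
pof = failures.mean()
print(f"Estimated probability of failure: {pof:.2e} "
      f"(+/- {1.96 * np.sqrt(pof * (1 - pof) / N):.1e} at 95%)")
```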


1977 ◽  
Vol 25 (2) ◽  
pp. 133-153 ◽  
Author(s):  
R. R. Vera ◽  
J. G. Morris ◽  
Ling-Jung Koong

SUMMARY A quantitative model of energy intake and utilization by ewes in various physiological states grazing a pasture of perennial ryegrass is presented. The model consists of two major components: a forage intake component incorporating forage availability and maximum potential intake as affected by age of the grass and physiological state of the ewe, and a second component representing energy utilization by the ewe for maintenance, growth, pregnancy, lactation and wool growth. The behaviour of the model was examined and found to be consistent with published information. Partial validation of the model was accomplished by comparing the model output with actual experimental data not used to construct the model.
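
The two-component structure described above can be sketched in outline as follows. All coefficients are illustrative assumptions, not the parameter values published in the paper.

```python
# Minimal sketch of the two-component structure described in the abstract:
# (1) forage intake limited by availability and potential intake, and
# (2) partition of metabolizable energy to maintenance, pregnancy, lactation,
# wool, and growth. All coefficients are illustrative assumptions only.

def daily_intake(forage_kg_dm_per_ha: float, potential_intake_kg: float) -> float:
    """Intake rises with availability and saturates at the potential intake."""
    half_saturation = 800.0  # kg DM/ha at which intake is half of potential (assumed)
    return potential_intake_kg * forage_kg_dm_per_ha / (forage_kg_dm_per_ha + half_saturation)

def energy_for_growth(intake_kg: float, me_mj_per_kg: float, maint_mj: float,
                      pregnancy_mj: float, lactation_mj: float, wool_mj: float) -> float:
    """Energy left for growth (MJ/day); negative means body reserves are mobilized."""
    me_intake = intake_kg * me_mj_per_kg
    return me_intake - (maint_mj + pregnancy_mj + lactation_mj + wool_mj)

intake = daily_intake(forage_kg_dm_per_ha=1200.0, potential_intake_kg=1.8)
surplus = energy_for_growth(intake, me_mj_per_kg=10.5, maint_mj=8.0,
                            pregnancy_mj=0.0, lactation_mj=7.5, wool_mj=0.4)
print(f"intake = {intake:.2f} kg DM/day, energy for growth = {surplus:.1f} MJ/day")
```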


Author(s):  
Jose Luis Martinez Gonzalez ◽  
Enrique Rodriguez Betancourt ◽  
Roberto Ramirez ◽  
Lorenzo Martinez Gomez ◽  
Arturo Godoy Simon

Some pipeline operators evaluate risk on an individual pipeline basis even if the right of way (ROW) is shared with other pipelines. Determining a ROW strip risk condition may be complex or quite simple, according to the model adopted by the analyst. If the pipelines located in a shared ROW belong to different operators, it is very likely that they apply different methods to evaluate a risk condition. The relative risk contributions cannot simply be added to estimate the risk of a ROW strip. In Mexico, insurance companies request studies of collective risk in pipelines to decide whether to increase a premium or reduce coverage. This request does not have technical support or engineering guidelines to perform the analysis. In Pemex there are few documented events where a pipeline failure affects parallel pipelines, known as collateral damage. There are some methods to estimate potential collateral damage as a function of soil damping and separation between pipelines (Ref. 2). This scheme applies to gas pipelines and has to be complemented with a probabilistic analysis of ignition scenarios. In the case of hazardous liquids, leak and rupture scenarios have to be considered, including potential shed routes, product concentration sites and operator response capability. Since risk is assessed with particular and specific attributes of a pipeline, the probability of failure cannot be directly added to that of adjacent pipelines. Some failure mechanisms are common to pipelines sharing the ROW, such as external corrosion and stress corrosion cracking (SCC), with different intensity depending on coating and cathodic protection (CP) efficiency. Internal corrosion depends on other factors such as product features, so it does not necessarily repeat with the same magnitude in all pipelines. Pipeline threats can be expected to be the same in this case, albeit with different intensity. For instance, third party activity and weather can threaten all pipelines located in the same ROW. These pipelines may present similar symptoms with different magnitude. Cover depth, additional protection and wall thickness play an important role in reducing third party (TP) and weather and outside forces (WOF) threats. The paper provides risk results for a ROW strip based on probability of failure values. Pipelines with the biggest risk contribution were identified, and integrity management alignment diagrams were obtained to correlate with risk values. A simple algorithm was developed to process risk results in terms of shared ROW buffer dimensions. The study is complemented with the results of a consequence simulation analysis for a gas pipeline.
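
The point that per-pipeline probabilities of failure cannot simply be added can be illustrated with a short sketch. Under an independence assumption (which ignores the collateral-damage and common-cause effects discussed in the paper), the probability of at least one failure within a ROW buffer is the complement of no pipeline failing; the naive sum is only an upper bound. The values below are illustrative.

```python
# Minimal sketch showing why per-pipeline probabilities of failure are not
# simply additive across a shared ROW. Independence between pipelines is
# assumed here purely for illustration; it ignores collateral-damage and
# common-cause effects. Values are illustrative.

def row_buffer_pof(pipeline_pofs: list[float]) -> float:
    """P(at least one failure) assuming independent pipelines."""
    p_no_failure = 1.0
    for p in pipeline_pofs:
        p_no_failure *= (1.0 - p)
    return 1.0 - p_no_failure

pofs = [2.0e-3, 5.0e-4, 1.2e-3]  # annual PoF per pipeline in the buffer (assumed)
print(f"combined (independent): {row_buffer_pof(pofs):.3e}")
print(f"naive sum:              {sum(pofs):.3e}")  # upper bound, overstates the risk
```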


Author(s):  
Robert A. McElroy

Recently enacted U.S. regulations will require distribution system operators to develop Distribution Integrity Management Programs (DIMP). The purpose of this regulation is to reduce system operating risks and the probability of failure by requiring operators to establish a documented, systematic approach to evaluating and managing risks associated with their pipeline systems. Distribution Integrity Management places new and significant requirements on distribution operators’ Geographic Information System (GIS). Operators already gather much of the data needed for meeting this regulation. The challenge lies in efficiently and accurately integrating and evaluating all system data so operators can identify and implement measures to address risks, monitor progress and report on results. Similar to the role geospatial solutions played in helping transmission pipeline operators meet Integrity Management Program requirements, this paper will discuss the role GIS can play in helping operators meet the DIMP regulations. Data requirements, storage and integration will also be presented. The paper will give examples of how risk-based decision making can improve operational efficiency and resource allocation.


2014 ◽  
Vol 27 (16) ◽  
pp. 6265-6287 ◽  
Author(s):  
Mitchell Bushuk ◽  
Dimitrios Giannakis ◽  
Andrew J. Majda

Abstract This paper studies spatiotemporal modes of variability of sea ice concentration and sea surface temperature (SST) in the North Pacific sector in a comprehensive climate model and observations. These modes are obtained via nonlinear Laplacian spectral analysis (NLSA), a recently developed data analysis technique for high-dimensional nonlinear datasets. The existing NLSA algorithm is modified to allow for a scale-invariant coupled analysis of multiple variables in different physical units. The coupled NLSA modes are utilized to investigate North Pacific sea ice reemergence: a process in which sea ice anomalies originating in the melt season (spring) are positively correlated with anomalies in the growth season (fall) despite a loss of correlation in the intervening summer months. It is found that a low-dimensional family of NLSA modes is able to reproduce the lagged correlations observed in sea ice data from the North Pacific Ocean. This mode family exists in both model output and observations and is closely related to the North Pacific gyre oscillation (NPGO), a low-frequency pattern of North Pacific SST variability. Moreover, this mode family provides a mechanism for sea ice reemergence in which summer SST anomalies store the memory of spring sea ice anomalies, allowing for sea ice anomalies of the same sign to appear in the fall season. Lagged correlations in model output and observations are significantly strengthened by conditioning on the NPGO mode being active, in either positive or negative phase. Another family of NLSA modes, related to the Pacific decadal oscillation (PDO), is found to capture a winter-to-winter reemergence of SST anomalies.
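
The lagged-correlation diagnostic described above can be sketched with synthetic data: correlate spring (melt season) sea ice anomalies with fall (growth season) anomalies, with and without conditioning on a low-frequency mode being active. This is not the NLSA algorithm itself, and the series below are entirely synthetic.

```python
# Minimal sketch of the lagged-correlation diagnostic described in the
# abstract. A synthetic low-frequency mode imprints a common signal on
# spring and fall sea ice anomalies, mimicking SST-mediated reemergence.
# This is not the NLSA algorithm; the data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
n_years = 500

mode = rng.normal(0.0, 1.0, n_years)               # NPGO-like index (synthetic)
spring_ice = 0.8 * mode + rng.normal(0.0, 1.0, n_years)
fall_ice   = 0.8 * mode + rng.normal(0.0, 1.0, n_years)

def corr(x, y):
    return float(np.corrcoef(x, y)[0, 1])

active = np.abs(mode) > 1.0  # "mode active" in either positive or negative phase
print(f"all years:         r = {corr(spring_ice, fall_ice):.2f}")
print(f"mode-active years: r = {corr(spring_ice[active], fall_ice[active]):.2f}")
```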


Author(s):  
H. Cathcart ◽  
G. Horne ◽  
J. Parkinson ◽  
A. Moffat ◽  
M. Joyce

Abstract Structural integrity assessments typically aim to calculate the integrity of a component under nominal or best estimate conditions. To account for potential variability and uncertainty present in the system, safety factors are often applied to assessment inputs and outputs. This approach does not allow the level of conservatism present to be quantified, often leading to over-conservatism or inadvertent non-conservatism. Probabilistic assessments explicitly calculate the probability of failure based on distributions of the input parameters and hence quantify the margin present in the assessment, leading to a greater understanding of the system. In this study a creep-fatigue damage assessment of a transiently loaded piping component is used as a vehicle to investigate some of the challenges and benefits of probabilistic assessments. A probabilistic assessment of the component life is compared to a lower-bound deterministic calculation to identify the mismatch in margin between the two results. The potential inaccuracies introduced when reducing the computational burden of Monte Carlo simulations with response surface methodologies are explored and tested. Finally, two challenges when attempting to underwrite a very low probability of failure are tackled: the inference of the shape of a distribution’s tails from limited experimental data and the uncertainty of extreme percentiles of finite Monte Carlo samples.
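
The last challenge mentioned, the uncertainty of extreme percentiles of finite Monte Carlo samples, can be made concrete with an exact binomial (Clopper-Pearson) interval on a failure probability estimated from N trials. This is an illustrative calculation, not the method used in the paper.

```python
# Minimal sketch of one challenge raised in the abstract: how uncertain is a
# very low probability of failure estimated from a finite Monte Carlo sample?
# A Clopper-Pearson (exact binomial) interval on the failure fraction makes
# the sampling uncertainty explicit. Illustrative only.
from scipy.stats import beta

def clopper_pearson(failures: int, trials: int, conf: float = 0.95):
    """Exact binomial confidence interval for a failure probability."""
    alpha = 1.0 - conf
    lower = 0.0 if failures == 0 else beta.ppf(alpha / 2, failures, trials - failures + 1)
    upper = 1.0 if failures == trials else beta.ppf(1 - alpha / 2, failures + 1, trials - failures)
    return lower, upper

# Zero observed failures in 10,000 trials still leaves a 95% upper bound near
# 3.7e-4 -- far too coarse to underwrite a target such as 1e-6.
for n in (10_000, 1_000_000):
    lo, hi = clopper_pearson(failures=0, trials=n)
    print(f"N = {n:>9,}: 95% upper bound on PoF = {hi:.1e}")
```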


Author(s):  
Alex Nemeth ◽  
Sherif Hassanien ◽  
Len Leblanc

The use of integrity reliability science is becoming a prevalent element in the pipeline integrity management process. One of the key elements in this process is defining what integrity reliability targets to achieve in order to maintain the safety of the system. IPC2016-64425 presented different industry approaches to defining reliability target levels for pipelines. It discussed the importance of setting operators’ specific integrity target reliability levels, how to choose such targets, and how to determine the safety of a pipeline asset by comparing the probability of failure (PoF) against an integrity permissible probability of failure (PoFp) while keeping an eye on the estimated expected number of failures. Building upon the previous discussion, this paper reviews a risk-based approach for estimating integrity reliability targets that accounts for the consequence of a potential release. Given available technical publications, the as low as reasonably practicable (ALARP) concept, and operators’ specific risk tolerances, there is room for improving the communication of integrity reliability along with selected targets. The paper describes how codes, standards, and operators set reliability targets, how operator-specific targets can be chosen, and how industry currently recommends liquid pipeline reliability targets. Moreover, the paper proposes different approaches to define practical reliability targets coupled with an integrity risk-informed decision making framework.
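
The PoF-versus-PoFp comparison described above can be sketched as follows. The segment data and the permissible value are illustrative assumptions, not the targets recommended in the paper.

```python
# Minimal sketch of the comparison described in the abstract: check each
# segment's probability of failure (PoF) against a permissible probability
# of failure (PoFp) while tracking the expected number of failures across
# the system. Segment data and the target value are illustrative only.

segments = [
    # (segment id, length in km, PoF per km-year)
    ("SEG-01", 12.0, 3.0e-6),
    ("SEG-02", 4.5, 8.0e-5),
    ("SEG-03", 30.0, 1.5e-6),
]
POFP_PER_KM_YEAR = 1.0e-5  # assumed permissible PoF per km-year

expected_failures = 0.0
for seg_id, length_km, pof_per_km in segments:
    expected_failures += pof_per_km * length_km
    status = "OK" if pof_per_km <= POFP_PER_KM_YEAR else "EXCEEDS TARGET"
    print(f"{seg_id}: PoF = {pof_per_km:.1e}/km-yr -> {status}")

print(f"Expected failures per year (system): {expected_failures:.2e}")
```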


Foods ◽  
2020 ◽  
Vol 9 (7) ◽  
pp. 851
Author(s):  
Mauricio Flores-Valdez ◽  
Ofelia Gabriela Meza-Márquez ◽  
Guillermo Osorio-Revilla ◽  
Tzayhri Gallardo-Velázquez

Food adulteration is an illegal practice performed to elicit economic benefits. In the context of roasted and ground coffee, legumes, cereals, nuts and other vegetables are often used to augment the production volume; however, these adulterants lack the most important coffee compound, caffeine, which has health benefits. In this study, the mid-infrared Fourier transform spectroscopy (FT-MIR) technique coupled with chemometrics was used to identify and quantify adulterants in coffee (Coffea arabica L.). Coffee samples were adulterated with corn, barley, soy, oat, rice and coffee husks, in proportions ranging from 1% to 30%. A discrimination model was developed using the soft independent modeling of class analogy (SIMCA) framework, and quantitative models were developed using partial least squares regression with one response variable (PLS1) and multiple response variables (PLS2) as well as principal component regression (PCR). The SIMCA model exhibited an accuracy of 100% and could discriminate among all the classes. The quantitative model with the highest performance corresponded to the PLS1 algorithm. The model exhibited an R²c ≥ 0.99, a standard error of calibration (SEC) of 0.39–0.82, and a standard error of prediction (SEP) of 0.45–0.94. The developed models could identify and quantify the coffee adulterants, and the proposed methodology is considered applicable to identifying and quantifying the adulterants used in the coffee industry.
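
The PLS1 quantification step can be sketched as follows. The spectra below are synthetic, SEC and SEP are computed here as the RMSE of the calibration and prediction sets, and the use of scikit-learn is an assumption for illustration rather than the software used in the study.

```python
# Minimal sketch of a PLS1-style quantification: regress adulterant
# percentage on spectra and report calibration/prediction errors.
# Spectra are synthetic; scikit-learn is used purely for illustration.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n_samples, n_wavenumbers = 120, 300

# Synthetic spectra: a "pure coffee" band plus an adulterant band whose
# intensity scales with the adulteration percentage (1-30%).
adulteration = rng.uniform(1.0, 30.0, n_samples)
x = np.linspace(0.0, 1.0, n_wavenumbers)
coffee_band = np.exp(-((x - 0.3) ** 2) / 0.01)
adulterant_band = np.exp(-((x - 0.7) ** 2) / 0.01)
spectra = (np.outer(np.ones(n_samples), coffee_band)
           + np.outer(adulteration / 100.0, adulterant_band)
           + rng.normal(0.0, 0.01, (n_samples, n_wavenumbers)))

X_cal, X_val, y_cal, y_val = train_test_split(spectra, adulteration,
                                              test_size=0.3, random_state=0)
pls = PLSRegression(n_components=5).fit(X_cal, y_cal)

sec = np.sqrt(np.mean((pls.predict(X_cal).ravel() - y_cal) ** 2))
sep = np.sqrt(np.mean((pls.predict(X_val).ravel() - y_val) ** 2))
print(f"SEC = {sec:.2f}%, SEP = {sep:.2f}%")
```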


Author(s):  
Colin Scott

Cracks in close proximity may interact and lead to leaks or ruptures at pressures well below the predicted failure pressures of the individual cracks. Several industry organizations and standards, including CEPA, ASME, API, and British Standards, provide guidance on the treatment of potentially interacting cracks. This guidance tends to be very conservative. This paper is a study of crack interaction, including a discussion of industry guidance, a critical review of failure pressure models, and a review of results of laboratory hydro-testing of pipe sections containing either in-service flaws or simulated flaws. In some cases the industry guidance and current failure pressure models provide inconsistent predictions, and this leads to uncertainty in the assessments used in routine crack management programs. The results of the hydro-testing are discussed in the context of both types of predictions. Understanding and predicting these interactions is important in maintaining an effective and efficient crack management program. The paper is aimed at engineers involved in integrity assessments and integrity management system process improvement.
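
The general form of a spacing-based interaction check can be sketched as follows: if the ligament between two collinear surface cracks is small relative to their sizes, assess them as a single combined flaw. The specific rule and threshold below are hypothetical placeholders, not the criterion from CEPA, ASME, API, or British Standards.

```python
# Minimal sketch of a spacing-based crack interaction check. The rule and
# threshold are hypothetical placeholders, NOT any particular standard's
# criterion.
from dataclasses import dataclass

@dataclass
class Crack:
    start_mm: float   # axial position of the leading tip
    length_mm: float  # axial length of the crack
    depth_mm: float   # maximum depth

def interact(a: Crack, b: Crack) -> bool:
    """Hypothetical rule: flaws interact if the axial ligament between them
    is less than the length of the shorter flaw."""
    ligament = max(b.start_mm - (a.start_mm + a.length_mm),
                   a.start_mm - (b.start_mm + b.length_mm))
    return ligament < min(a.length_mm, b.length_mm)

def combine(a: Crack, b: Crack) -> Crack:
    """Combined effective flaw spanning both cracks, at the deeper depth."""
    start = min(a.start_mm, b.start_mm)
    end = max(a.start_mm + a.length_mm, b.start_mm + b.length_mm)
    return Crack(start, end - start, max(a.depth_mm, b.depth_mm))

c1 = Crack(start_mm=0.0, length_mm=40.0, depth_mm=2.5)
c2 = Crack(start_mm=55.0, length_mm=25.0, depth_mm=3.0)
if interact(c1, c2):
    merged = combine(c1, c2)
    print(f"Assess as one flaw: length = {merged.length_mm:.0f} mm, depth = {merged.depth_mm:.1f} mm")
else:
    print("Assess cracks individually")
```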

