Making Integrity Decisions Using Metal Loss ILI Validation Process

Author(s):  
Yanping Li ◽  
Gordon Fredine ◽  
Yvan Hubert ◽  
Sherif Hassanien

With the increased number of In-Line Inspections (ILI) performed on pipelines, it is important to evaluate ILI tool performance to support rational integrity decisions. API 1163, “In-Line Inspection Systems Qualification,” outlines an ILI data set validation process based mainly on comparing ILI data with field measurements. The concept of comparing ILI results with previous ILI data is briefly mentioned in API 1163 Level 1 validation and discussed in detail in the CEPA Metal Loss ILI Tool Validation Guidance Document; however, the CEPA document recommends a different approach from API 1163. Although methodologies for validating ILI performance are available, the role of ILI validation in integrity management decision making, beyond determining whether an inspection data set is acceptable, is not well defined in these documents. Enbridge has reviewed the API 1163 and CEPA methodologies and developed a process to validate metal loss ILI results. This process uses API 1163 as the tool performance acceptance criterion, while the CEPA method provides additional information such as depth over-call or under-call. The process captures the main concepts of both methodologies and adds a new dimension to the validation procedure by evaluating different corrosion morphologies, depth ranges, and proximity to the long seam and girth welds. The process also checks ILI results against previous ILI data sets and combines the results of several inspections. The validation results of one inspection indicate whether the inspection data set is acceptable based on the ILI specification; this information is useful for excavation selection. A tool performance review based on several inspection data sets identifies the strengths and weaknesses of an inspection tool; this information is used to ensure the tool selection is appropriate for the expected feature types on the pipeline.
Applications of the validation process are provided to demonstrate how the process can aid in making integrity decisions and managing metal loss threats.
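The unity comparison of ILI calls against field measurements that underlies API 1163-style validation can be sketched as follows; the tolerance, certainty level, and depth values are illustrative assumptions, not the acceptance criteria actually used by Enbridge:

```python
# Sketch of an API 1163-style unity check: count ILI depth calls that fall
# within the tool's stated sizing tolerance of the field measurement.
# The tolerance and certainty level (+/-10 %WT at 80%) are illustrative.
def within_tolerance_fraction(ili_depths, field_depths, tol_wt=10.0):
    """Fraction of matched features whose ILI depth (%WT) is within tol_wt of field depth."""
    hits = sum(1 for i, f in zip(ili_depths, field_depths) if abs(i - f) <= tol_wt)
    return hits / len(ili_depths)

ili = [32.0, 45.0, 18.0, 50.0, 27.0]    # ILI-reported depths, %WT (hypothetical)
field = [38.0, 41.0, 30.0, 48.0, 27.0]  # field-measured depths, %WT (hypothetical)
frac = within_tolerance_fraction(ili, field)
accepted = frac >= 0.80  # accept the data set if the in-tolerance fraction meets the certainty
```

A rejected data set would then prompt the over-call/under-call review the CEPA-style analysis provides.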

Author(s):  
James Simek ◽  
Jed Ludlow ◽  
Phil Tisovec

In-Line Inspection (ILI) tools using the magnetic flux leakage (MFL) technique are the most common type used for performing metal loss surveys worldwide. Based upon the robust and proven MFL technique, these tools have been shown to operate reliably in the extremely harsh environments of transmission pipelines. In addition to metal loss, MFL tools are capable of identifying a broad range of pipeline features. Most MFL surveys to date have used tools employing axially oriented magnetizers, capable of detecting and quantifying many categories of volumetric metal loss features. For certain classes of axially oriented features, however, MFL tools using axially oriented fields have encountered difficulty in detection and subsequent quantification. To address features in these categories, tools employing circumferentially or transversely oriented fields have been designed and placed into service, enabling enhanced detection and sizing of axially oriented features. In most cases, multiple surveys are required, as current tools cannot collect both data sets concurrently. Applying the magnetic field in an oblique direction enables detection of axially oriented features and may be used simultaneously with an axially oriented tool. Referencing previous research in adapting circumferential or transverse designs for in-line service, the concept of an oblique field magnetizer is presented. Models developed to demonstrate the technique are discussed, along with experimental data supporting the concept. Efforts involved in implementing an oblique magnetizer, including magnetic models of field profiles used to determine magnetizer configurations and sensor locations, are presented. Experimental results are provided detailing the response of the system to a full range of metal loss features, supplementing modeling in an effort to determine the effects of variables introduced by differences in magnetic properties and velocity.
The experimental data include extremely narrow axially oriented features, many of which are not detected or identified within the axial data set. Experimental and field verification results for detection accuracy are described in comparison with an axial-field tool.


Author(s):  
Luis A. Torres ◽  
Matthew J. Fowler ◽  
Jordan G. Stenerson

Integrity management of dents on pipelines is currently performed through the interpretation of In-Line Inspection (ILI) data; this includes caliper, Magnetic Flux Leakage (MFL), and Ultrasonic Testing (UT) tools. Based on the available ILI data, dent features that are recognized as threats from a mechanical damage perspective are excavated and remediated. Federal codes and regulations provide rules and allow inference on what types of dent features may be a result of mechanical damage; nonetheless, there are challenges associated with identifying dents resulting from mechanical damage. One of the difficulties in managing the mechanical damage threat is the lack of information on how MFL and UT ILI tool performance is affected by dented areas of the pipe. ILI vendors do not offer any technical specifications for characterizing and sizing metal loss features in dents. It is generally expected that metal loss tool performance will be degraded in dented areas of the pipe, but it is not known to what degree. It is likely that the degradation will vary with feature shape, sensor design, and sensor placement. Because metal loss tool performance is unknown within the limits of the dented pipe, other methods for recognizing mechanical damage have been incorporated into mechanical damage management strategies. Some of these methods include strain-based assessments and characterization of shape complexity. To build a more effective integrity management program for mechanical damage, it is of critical importance to understand how tool performance is affected by dented areas of the pipe and what steps can be taken to use ILI information more effectively. In this paper, the effectiveness of MFL and UT wall measurement tools in characterizing and sizing metal loss features within dents is studied by evaluating ILI predictions against field results from non-destructive examinations of mechanical damage indications.
In addition, the effectiveness of using shape complexity indicators to identify mechanical damage is evaluated, introducing concepts such as dents in close proximity and multi-apex dents. Finally, the effectiveness of ILI tools in predicting dent association with girth welds is explored by comparing ILI and field results.


Author(s):  
Carl Legleiter ◽  
Brandon Overstreet

The Snake River is a central component of Grand Teton National Park, and this dynamic fluvial system plays a key role in shaping the landscape and maintaining a diversity of habitat conditions. The river’s inherent variability and propensity for change complicate effective characterization of this important resource, however, and conventional ground-based methods are not adequate for this purpose. Remote sensing provides an appealing alternative that could facilitate resource management while providing novel insight into factors influencing channel form and behavior. This study evaluates the potential for using optical data to measure the morphology and dynamics of a large, complex river such as the Snake. More specifically, we assessed the feasibility of estimating flow depth from multispectral satellite images acquired in September 2011. Our initial results indicate that reliable maps of river bathymetry can be produced from such data. We are also examining channel changes associated with a prolonged period of high flow during the 2011 snowmelt runoff season by comparing these satellite images with digital aerial photography from August 2010. An extensive field data set on flow velocities provides hydraulic context for the observed morphodynamics. More sophisticated hyperspectral and LiDAR data sets are scheduled for collection in 2012, along with additional field measurements.
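Depth estimation from multispectral imagery is commonly done by regressing field-measured depths against the logarithm of a two-band reflectance ratio; a minimal sketch under that assumption follows, with entirely synthetic reflectances and depths (the bands and values are not from this study):

```python
import math

# Sketch of log band-ratio depth retrieval: X = ln(r_band1 / r_band2),
# depth ~ b0 + b1 * X, with b0, b1 calibrated against field depths.
def calibrate(r1, r2, depths):
    """Least-squares fit of depth = b0 + b1 * ln(r1/r2)."""
    X = [math.log(a / b) for a, b in zip(r1, r2)]
    n = len(X)
    mx, md = sum(X) / n, sum(depths) / n
    b1 = (sum((x - mx) * (d - md) for x, d in zip(X, depths))
          / sum((x - mx) ** 2 for x in X))
    return md - b1 * mx, b1  # (b0, b1)

def predict(r1, r2, b0, b1):
    return [b0 + b1 * math.log(a / b) for a, b in zip(r1, r2)]

# Synthetic reflectances chosen so ln(r_g/r_r) = 0, 1, -1, with depths d = 1.0 - 0.5*X
r_g = [0.05, 0.05 * math.e, 0.05 / math.e]
r_r = [0.05, 0.05, 0.05]
d = [1.0, 0.5, 1.5]  # metres (made up)
b0, b1 = calibrate(r_g, r_r, d)
d_hat = predict(r_g, r_r, b0, b1)
```

In practice the calibrated relation is then applied pixel-by-pixel to map bathymetry across the image.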


Author(s):  
Kevin Spencer ◽  
Shahani Kariyawasam ◽  
Cathy Tetreault ◽  
Jon Wharf

Corrosion growth rates are an essential input to an Integrity Management Program, but they can often be the largest source of uncertainty and error. A relatively simple method of estimating a corrosion growth rate is to compare the size of a corrosion anomaly over time, and the most practical way to do this for a whole pipeline system is via In-Line Inspection (ILI). However, the depth of an anomaly reported by an ILI run contains measurement uncertainties, i.e., sizing tolerances, that must be accounted for when defining the uncertainty, or error, associated with the measured corrosion growth rate. When the same inspection vendor performs both inspections, proven methods exist that enable this growth error to be significantly reduced, but these methods require raw inspection data as well as specialist software and analysis. Guidelines presently exist to estimate corrosion growth rates using inspection data from different ILI vendors. Although well documented, they are often only applicable to “simple” cases: pipelines containing isolated corrosion features with low feature density counts. As the feature density or corrosion complexity increases, different reporting specifications, interaction rules, analysis procedures, sizing models, etc., can become difficult to account for, ultimately leading to incorrect estimates or larger uncertainties in the growth error. This paper will address these issues through the experiences of a North American pipeline operator. Accurately quantifying the reliability of pipeline assets over time requires accurate corrosion growth rates, and the case study will demonstrate how the growth error was significantly reduced relative to existing methodologies. Historical excavation and recoat information was utilized to identify static defects and quantify systematic bias between inspections.
To reduce differences in reporting and in analyst interpretation of the recorded magnetic signals, novel analysis techniques were employed to normalize the data sets against each other. The resulting uncertainty of the corrosion growth rates was then further reduced by deriving and applying a regression model to reduce the effect of the different sizing models and the identified systematic bias. The reduced uncertainty ultimately led to a better understanding of the corrosion activity on the pipeline and facilitated better integrity management decisions.
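The basic run-to-run growth calculation, with the two runs' sizing tolerances combined in quadrature, can be sketched as follows; the depths, interval, and tolerance values are illustrative, and the bias-correction and regression steps described above are not reproduced:

```python
import math

# Sketch: run-to-run corrosion growth rate with combined sizing uncertainty.
# Depths, interval, and tolerances (as 1-sigma values in %WT) are illustrative.
def growth_rate(d1, d2, years, sd1, sd2):
    """Rate in %WT/yr and its 1-sigma error from two ILI depth calls."""
    rate = (d2 - d1) / years
    sd = math.sqrt(sd1 ** 2 + sd2 ** 2) / years  # tolerances add in quadrature
    return rate, sd

rate, sd = growth_rate(d1=20.0, d2=32.0, years=6.0, sd1=4.0, sd2=4.0)
```

The quadrature term is why run-to-run rates carry such large error bars, and why normalizing the data sets and removing systematic bias reduces the growth error so markedly.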


1996 ◽  
Vol 31 (3) ◽  
pp. 609-622
Author(s):  
Efraim Halfon

Abstract Data Animator, V1.0, is a scientific visualization package for microcomputers. Its main purpose is to generate two-dimensional animations from any data set collected over time. Geographical references such as a shoreline and/or bathymetry information may be added for additional clarity. Visualization of data as animations greatly simplifies the interpretation of field measurements. Data Animator is designed (but not restricted) to display data collected in aquatic environments (lakes, rivers, estuaries, oceans, etc.) in a clear, concise way, using colour to represent ranges of data values. Data sets can also be displayed as static images (keyframes). All of Data Animator's options can be accessed through a graphical user interface (GUI) that allows the user to choose the viewpoint, fonts, colour palette, data, and keyframes. Point-and-click mouse operations allow the user to manipulate many features, with immediate on-screen feedback. Animations are generated by defining keyframes of known data, each located at a specific time. The program can then interpolate over time, between keyframes, to create smoothly animated transitions (in-between frames). Two types of graphs can be rendered with Data Animator: plane-type graphs are horizontal slices at a depth specified by the user, while transect-type graphs are vertical slices along a straight line defined by the user. Data Animator can make use of both shore outline information and three-dimensional bathymetry information, allowing the generation of realistic-looking graphs that follow the shape of the aquatic environment. Animations can be displayed on a computer monitor or transferred to videotape. pH data from Hamilton Harbour have been visualized, and the results are discussed.
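The in-between frame generation described above amounts to interpolating each grid cell between two keyframes; a minimal linear-interpolation sketch (the grid and pH values below are made up, and the package's actual interpolation scheme may differ):

```python
# Sketch: linear interpolation between two keyframes of gridded data to
# produce an in-between animation frame at fractional time t.
def inbetween(frame_a, frame_b, t):
    """t in [0, 1]: 0 returns frame_a, 1 returns frame_b."""
    return [[a + t * (b - a) for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(frame_a, frame_b)]

key0 = [[7.0, 7.2], [7.4, 7.6]]  # e.g. gridded pH values at time t0 (hypothetical)
key1 = [[7.4, 7.6], [7.8, 8.0]]  # gridded pH at time t1 (hypothetical)
mid = inbetween(key0, key1, 0.5)  # halfway frame
```

Sweeping t from 0 to 1 over many small steps yields the smooth transition frames played back in the animation.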


Author(s):  
Érika S. M. Nicoletti ◽  
Ricardo D. de Souza

Pipeline operators map and quantify corrosion damage along their aging pipeline systems by carrying out periodic in-line metal-loss inspections. Comparing the data sets from successive runs of such inspections is one of the most reliable techniques for inferring representative corrosion growth rates along the pipeline length within the period between two inspections. Presently there are two distinct approaches to inferring corrosion rates based on multiple in-line inspections: individual comparison of the detected defective areas (quantified by more than one inspection), and comparison between populations. The former usually requires a laborious matching process between the run data sets, while the drawback of the latter is that it often fails to notice hot-spot areas. The objective of this work is to present a new methodology that allows quick data comparison of two runs while still maintaining the local, distinct characteristics of the corrosion process severity. Three procedures must be performed. First, the ILI metal-loss data sets should be submitted to a filtering/adjustment process, taking into consideration reporting-threshold consistency, the possible existence of systematic bias, and corrosion-mechanism similarity. Second, the average metal-loss growth rate between inspections should be determined based on the filtered populations. Third, the defects reported by the latest inspection should have their corrosion growth rates individually determined as a function of the mean depth values of the whole population and of the defect neighborhood. The methodology allows quick and realistic damage-progression estimates, supporting more cost-effective and reliable strategies for the integrity management of aged corroded systems. Model robustness and general feasibility are demonstrated in a real case study.
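The three-step methodology can be sketched in simplified form; the reporting threshold, depths, and local-scaling rule below are illustrative assumptions, not the paper's exact formulation:

```python
# Simplified sketch of the three-step population comparison.
def filter_threshold(depths, threshold=10.0):
    """Step 1: keep only defects above a common reporting threshold (%WT)."""
    return [d for d in depths if d >= threshold]

def average_growth(run1, run2, years, threshold=10.0):
    """Step 2: average growth rate from the filtered population means (%WT/yr)."""
    p1 = filter_threshold(run1, threshold)
    p2 = filter_threshold(run2, threshold)
    return (sum(p2) / len(p2) - sum(p1) / len(p1)) / years

def local_rate(neighbour_depths, global_mean, avg_rate):
    """Step 3: scale the average rate by the local vs. global mean depth."""
    local_mean = sum(neighbour_depths) / len(neighbour_depths)
    return avg_rate * local_mean / global_mean

run1 = [12.0, 15.0, 20.0, 8.0, 25.0]  # earlier inspection depths, %WT (made up)
run2 = [14.0, 18.0, 26.0, 9.0, 33.0]  # later inspection depths, %WT (made up)
avg = average_growth(run1, run2, years=5.0)
glob = sum(filter_threshold(run2)) / len(filter_threshold(run2))
hot = local_rate([26.0, 33.0], glob, avg)  # deeper neighbourhood -> higher local rate
```

The local scaling in step 3 is what lets the population-based approach retain hot-spot behaviour that a plain population comparison would average away.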


Author(s):  
Perry Barham ◽  
Bryce Brown ◽  
Martine Fingerhut ◽  
Patrick Porter

For many years, BP Pipelines North America has used high-resolution Magnetic Flux Leakage (MFL) in-line inspection (ILI) technology to help maintain the integrity of its pipelines. The improvements in this technology that now allow an operator to make integrity decisions also bring challenges. Reports from ILI can list thousands, or even hundreds of thousands, of individual anomalies or features; when combined with data from NDT field measurements and existing pipe tallies, this volume can become overwhelming, and methods had to be developed to distill the information for further analysis. BP Pipelines NA encouraged cooperation between all parties involved in the integrity process to adapt reporting requirements and work procedures, to provide the best available information for integrity analysis, and to ensure continued improvement. This cooperation is a key part of the integrity equation and essential to a successful program. This paper presents an overview of the validation process undertaken on a 51 km (32-mile) section of 457 mm (18-inch) pipeline. This pipe section was inspected in 1999 and again in 2003 by the same inspection company, providing an opportunity to evaluate improvements in inspection technology, assess repeatability of performance, and develop an engineering-based approach to review, analyze, and validate high-resolution metal loss MFL data. Field verification and data validation included the use of several NDE techniques to acquire field measurements to overlay and compare with the ILI data. Anomaly classification and distribution are examined, and methods of selecting validation locations for future inspections are developed. In addition to the primary goal outlined, the 2003 repair program provided an opportunity to evaluate the performance of the composite sleeve reinforcements applied in 1999, after four years of service.


2018 ◽  
Vol 154 (2) ◽  
pp. 149-155
Author(s):  
Michael Archer

1. Yearly records of worker Vespula germanica (Fabricius) taken in suction traps at Silwood Park (28 years) and at Rothamsted Research (39 years) are examined.
2. Using the autocorrelation function (ACF), a significant negative correlation at a 1-year lag, followed by a lesser, non-significant positive correlation at a 2-year lag, was found in all, or parts of, each data set, indicating an underlying population dynamic of a 2-year cycle with a damped waveform.
3. The minimum number of years before the 2-year cycle with damped waveform became apparent varied between 17 and 26; in some data sets the cycle was not found.
4. Ecological factors delaying or preventing the occurrence of the 2-year cycle are considered.
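The lag-1/lag-2 autocorrelation signature of a 2-year cycle can be illustrated with a short synthetic series; the counts below are made up, not the Silwood Park or Rothamsted data:

```python
# Sketch: lag-k sample autocorrelation of a yearly abundance series.
def acf(series, lag):
    n = len(series)
    m = sum(series) / n
    var = sum((x - m) ** 2 for x in series)
    cov = sum((series[t] - m) * (series[t + lag] - m) for t in range(n - lag))
    return cov / var

# Alternating high/low years mimic a 2-year cycle (hypothetical worker counts).
counts = [120, 40, 110, 35, 130, 45, 115, 30]
r1 = acf(counts, 1)  # negative: a high year tends to be followed by a low year
r2 = acf(counts, 2)  # positive, weaker: every second year resembles itself
```

A damped waveform would show these alternating-sign correlations shrinking in magnitude at longer lags.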


2018 ◽  
Vol 21 (2) ◽  
pp. 117-124 ◽  
Author(s):  
Bakhtyar Sepehri ◽  
Nematollah Omidikia ◽  
Mohsen Kompany-Zareh ◽  
Raouf Ghavami

Aims & Scope: In this research, 8 variable selection approaches were used to investigate the effect of variable selection on the predictive power and stability of CoMFA models. Materials & Methods: Three data sets, including 36 EPAC antagonists, 79 CD38 inhibitors, and 57 ATAD2 bromodomain inhibitors, were modelled by CoMFA. First, for all three data sets, CoMFA models with all CoMFA descriptors were created; then, by applying each variable selection method, a new CoMFA model was developed, so that 9 CoMFA models were built for each data set. The obtained results show that noisy and uninformative variables affect CoMFA results. Based on the created models, applying 5 variable selection approaches, including FFD, SRD-FFD, IVE-PLS, SRD-UVE-PLS, and SPA-jackknife, significantly increases the predictive power and stability of CoMFA models. Results & Conclusion: Among them, SPA-jackknife removes most of the variables, while FFD retains most of them. FFD and IVE-PLS are time-consuming processes, while SRD-FFD and SRD-UVE-PLS run in a few seconds. Applying FFD, SRD-FFD, IVE-PLS, and SRD-UVE-PLS also preserves CoMFA contour-map information for both fields.
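As a generic illustration of jackknife resampling (not the SPA-jackknife variable selection actually applied to the CoMFA fields here), the leave-one-out estimate and variance of a simple statistic can be computed as:

```python
# Generic jackknife sketch: leave-one-out resampling gives a bias-corrected
# estimate and a variance that gauges the stability of a statistic.
def jackknife(values, stat):
    n = len(values)
    loo = [stat(values[:i] + values[i + 1:]) for i in range(n)]  # leave-one-out
    loo_mean = sum(loo) / n
    est = stat(values)
    bias = (n - 1) * (loo_mean - est)
    var = (n - 1) / n * sum((x - loo_mean) ** 2 for x in loo)
    return est - bias, var

mean = lambda xs: sum(xs) / len(xs)
corrected, var = jackknife([1.0, 2.0, 3.0, 4.0], mean)  # toy data, not CoMFA descriptors
```

In variable selection contexts, a descriptor whose contribution is unstable across the leave-one-out models is a candidate for removal.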


Author(s):  
Kyungkoo Jun

Background & Objective: This paper proposes a Fourier-transform-inspired method to classify human activities from time-series sensor data. Methods: Our method begins by decomposing the 1D input signal into 2D patterns, which is motivated by the Fourier conversion. The decomposition is aided by Long Short-Term Memory (LSTM), which captures the temporal dependency of the signal and then produces encoded sequences. The sequences, once arranged into a 2D array, can represent the fingerprints of the signals. The benefit of such a transformation is that we can exploit recent advances in deep learning models for image classification, such as the Convolutional Neural Network (CNN). Results: The proposed model is therefore a combination of LSTM and CNN. We evaluate the model on two data sets. For the first data set, which is more standardized than the other, our model outperforms previous work or at least matches it. For the second data set, we devise schemes to generate training and testing data by varying the window size, the sliding size, and the labeling scheme. Conclusion: The evaluation results show that the accuracy is over 95% in some cases. We also analyze the effect of these parameters on performance.
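The 1D-to-2D arrangement step can be illustrated, minus the LSTM encoding, by a plain sliding-window rearrangement; the window and slide sizes are illustrative parameters of the kind varied in the second-data-set experiments:

```python
# Sketch: arranging a 1D sensor signal into a 2D array via sliding windows,
# standing in for the paper's LSTM-encoded sequences.
def to_2d(signal, window, slide):
    """Each row is one window of the signal; stacked rows form the 2D pattern."""
    return [signal[start:start + window]
            for start in range(0, len(signal) - window + 1, slide)]

sig = list(range(10))             # toy stand-in for a sensor reading stream
patterns = to_2d(sig, window=4, slide=2)
```

The resulting 2D array is what an image-classification model such as a CNN would then consume.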

