Data Analytics Software for Automatic Detection of Anomalies in Well Testing

2021 ◽  
Author(s):  
Stefano Capponi ◽  
Chiazor Nwachukwu

Abstract This paper presents software developed to diagnose well test data. The software monitors the data and, through a series of algorithms, alerts the user to discrepancies, allowing possible sources of error to be investigated and corrected in real time. Several datasets from previous operations were analyzed and the basic physics governing how each datum depends on others was laid out. All the well test data traditionally acquired were arranged in a matrix showing the dependencies between each datum and the other physical properties available, either measured or modelled. Acceptable fluctuations in acquired data were also identified for use as tolerance limits. The software scans through the data as it is acquired and raises an alarm when the identified dependencies are broken; it also identifies which parameter is most likely causing the error. The software was built based on previous well test data and reports. Subsequently, two field trials were conducted to fine-tune the algorithms and allowable data fluctuations. Validating the software consisted of: (1) identifying flagged errors that should not have been flagged (dependencies set too tight); (2) identifying errors that should have been flagged but were not (dependencies set too loose); (3) improving the user interface for ease of use. The results were positive, with several improvements in error recognition and several discrepancies flagged that would not have been caught by the naked eye. The user interface was also improved, allowing the user to clear error messages and provide input to improve the algorithm. The field trials also demonstrated that the methodology is scalable to other data acquisition plans and to more advanced analytics. The algorithms are simple, allowing the software to be implemented in all operations; more advanced algorithms are likely to depend on job-specific data and parameters.
Traditional data acquisition systems used during well tests only present the data; alarms draw the user's attention only when certain defined operability limits are about to be reached. Being able to confirm that the data are cohesive during the well test prevents a loss of confidence in the results and painful post-processing exercises. Moreover, because the algorithms used are based on simple physics, the software is easy to deploy in any operation.
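The dependency-and-tolerance check described in this abstract can be sketched in a few lines. All variable names, the example rule, and the tolerance values below are hypothetical illustrations of the idea, not the paper's actual implementation:

```python
# Minimal sketch of a physics-based dependency check with tolerance limits.
# Each rule names a measured parameter, a function computing its expected
# value from the other measurements, and an allowed relative fluctuation.

def check_dependencies(sample, rules):
    """Return (name, measured, expected) for each broken dependency."""
    alarms = []
    for name, expected_fn, tol in rules:
        expected = expected_fn(sample)
        measured = sample[name]
        # Flag when the deviation exceeds the allowed fluctuation band
        if abs(measured - expected) > tol * abs(expected):
            alarms.append((name, measured, expected))
    return alarms

# Hypothetical rule: separator gas rate should track oil rate times GOR
rules = [
    ("q_gas", lambda s: s["q_oil"] * s["gor"], 0.05),  # 5% tolerance
]
sample = {"q_oil": 1200.0, "gor": 0.8, "q_gas": 1100.0}
print(check_dependencies(sample, rules))
```

Scanning each incoming sample against such a rule list is what lets the software point at the parameter most likely causing the error: the alarm carries the offending name along with the measured and expected values.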

2021 ◽  
Author(s):  
Nagaraju Reddicharla ◽  
Subba Ramarao Rachapudi ◽  
Indra Utama ◽  
Furqan Ahmed Khan ◽  
Prabhker Reddy Vanam ◽  
...  

Abstract Well testing is one of the vital processes in reservoir performance monitoring. As a field matures and the well stock grows, testing becomes a tedious job in terms of resources (MPFMs and test separators), and this affects the production quota delivery. In addition, test data validation and approval follow a business process that takes up to 10 days to accept or reject a well test. Nearly 10,000 well tests were conducted, of which around 10 to 15% were rejected statistically per year. The objective of this paper is to develop a methodology that reduces well test rejections and raises a timely flag for operator intervention to recommence the well test. The case study was applied in a mature field that has been producing for 40 years and has a large volume of historical well test data available. This paper discusses the development of a data-driven well test data analyzer and optimizer, supported by artificial intelligence (AI), for wells tested with MPFMs in a two-stage approach. The motivating idea is to ingest historical data, real-time data, and well model performance curves, and to flag the quality of the well test data to the operator in real time. The ML prediction results help testing operations and can reduce the test acceptance turnaround drastically, from 10 days to hours. In the second layer, an unsupervised model built on historical data helps identify the parameters driving well test rejection, for example testing duration, choke size, and GOR. The outcome of the modeling will be incorporated into updates of the well test procedure and testing philosophy. This approach is under evaluation in one of the assets of ADNOC Onshore. The results are expected to reduce well test rejections by at least 5%, which further optimizes the resources required and improves the back allocation process.
Furthermore, real-time flagging of test quality will help reduce the validation cycle from 10 days to hours, improving the well testing cycle process. The methodology improves integrated reservoir management compliance with well testing requirements in assets where resources are limited, and is envisioned to be integrated with a full-field digital oilfield implementation. This is a novel approach to applying machine learning and artificial intelligence to well testing. It maximizes the utilization of real-time data to create an advisory system that improves test data quality monitoring and enables timely decision-making to reduce well test rejections.
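The first layer of the two-stage approach, scoring a live test against accepted historical tests, can be illustrated with a deliberately simple baseline. The feature names, the band-based classifier, and all numbers below are hypothetical stand-ins, not the paper's actual AI models:

```python
# Toy baseline for real-time test-quality flagging: learn per-feature
# acceptance bands (mean +/- k*stdev) from historically accepted tests,
# then flag a live test whose features fall outside the bands.
from statistics import mean, stdev

FEATURES = ["duration_hr", "choke_size", "gor"]  # hypothetical features

def fit_acceptance_bands(accepted_tests, k=3.0):
    """Per-feature (low, high) bands learned from accepted historical tests."""
    bands = {}
    for f in FEATURES:
        vals = [t[f] for t in accepted_tests]
        m, s = mean(vals), stdev(vals)
        bands[f] = (m - k * s, m + k * s)
    return bands

def flag_test(test, bands):
    """Return the features falling outside the accepted bands (empty = OK)."""
    return [f for f in FEATURES if not bands[f][0] <= test[f] <= bands[f][1]]

# Hypothetical historical tests that were accepted
accepted = [
    {"duration_hr": 10, "choke_size": 32, "gor": 800},
    {"duration_hr": 12, "choke_size": 32, "gor": 820},
    {"duration_hr": 11, "choke_size": 34, "gor": 810},
]
bands = fit_acceptance_bands(accepted)
print(flag_test({"duration_hr": 2, "choke_size": 33, "gor": 815}, bands))
```

A production system would replace the bands with trained supervised models and add an unsupervised layer over rejected tests, but even this baseline shows how a flag can be raised while the test is still running, rather than days later in the validation cycle.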


2021 ◽  
Vol 134 (3) ◽  
pp. 35-38
Author(s):  
A. M. Svalov ◽  

Horner’s traditional method of processing well test data can be improved by a special transformation of the pressure curves, which reduces the time required for the transformed curves to reach the asymptotic regimes necessary for processing these data. In this case, to account for the skin factor and the wellbore storage effect, it is necessary to use a more complete asymptotic expansion of the exact solution of the conductivity equation at large values of time. At the same time, this approach does not completely eliminate the influence of the wellbore, since the asymptotic expansion of the solution for small values of time is limited by the existence of a singular point, in the vicinity of which the expansion ceases to be valid. To solve this problem, a new method of processing well test data is proposed that completely eliminates the influence of the wellbore. The method is based on the introduction of a modified well inflow function that includes a component of the boundary condition corresponding to the influence of the wellbore.
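For context, the classical Horner buildup relation that such transformations start from plots shut-in pressure against the logarithm of the Horner time ratio. In SI units, for infinite-acting radial flow after a producing time $t_p$ (this is the standard textbook result, not the paper's modified inflow function):

```latex
% Classical Horner buildup relation for a shut-in time \Delta t
p_{ws}(\Delta t) \;=\; p_i \;-\; \frac{q\,B\,\mu}{4\pi k h}\,
\ln\!\left(\frac{t_p + \Delta t}{\Delta t}\right)
```

where $p_{ws}$ is the shut-in bottomhole pressure, $p_i$ the initial reservoir pressure, and $q$, $B$, $\mu$, $k$, $h$ the rate, formation volume factor, viscosity, permeability, and net thickness. Analysis requires the semilog plot to reach its straight-line asymptote, which is precisely the waiting time the proposed transformation is designed to shorten.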


1986 ◽  
Author(s):  
R.N. Horne ◽  
J.L. Perrick ◽  
J. Barua

2021 ◽  
Author(s):  
Elias Temer ◽  
Deiveindran Subramaniam

Abstract Well testing is one of the crucial steps operators require to forecast the production investments of their fields. However, operators face many challenges, such as reduced capex, tight exploration budgets, and bad weather conditions that limit the well testing time window. To overcome these challenges, an automated well testing platform enabled real-time monitoring and control of more zones in a single run for appraisal wells in the Sea of Okhotsk, Russia. This article highlights the test objectives, the job planning, and the automated execution of wirelessly enabled operations under very hostile conditions and within a limited time period. The use of a telemetry system to well test seven zones allowed real-time data acquisition, control of critical downhole equipment, and data transmission to the operator's office in town. Various operational cases are discussed to demonstrate how automated data acquisition and downhole operations control optimized operations for both the service company and the operator.


2000 ◽  
Vol 3 (04) ◽  
pp. 325-334 ◽  
Author(s):  
J.L. Landa ◽  
R.N. Horne ◽  
M.M. Kamal ◽  
C.D. Jenkins

Summary In this paper we present a method to integrate well test, production, shut-in pressure, log, core, and geological data to obtain a reservoir description for the Pagerungan field, offshore Indonesia. The method computes spatial distributions of permeability and porosity and generates a pressure response for comparison to field data. This technique produced a good match with well-test data from three wells and seven shut-in pressures. The permeability and porosity distributions also provide a reasonable explanation of the observed effects of a nearby aquifer on individual wells. As a final step, the method is compared to an alternate technique (object modeling) that models the reservoir as a two-dimensional channel. Introduction The Pagerungan field has been under commercial production since 1994. This field was chosen to test a method of integrating dynamic well data and reservoir description data because the reservoir has only produced single phase gas, one zone in the reservoir is responsible for most of the production, and good quality well-test, core, and log data are available for most wells. The method that was used to perform the inversion of the spatial distribution of permeability and porosity uses a parameter estimation technique that calculates the gradients of the calculated reservoir pressure response with respect to the permeability and porosity in each of the cells of a reservoir simulation grid. The method is a derivative of the gradient simulator1 approach and is described in Appendices A and B. The objective is to find sets of distributions of permeability and porosity such that the calculated response of the reservoir closely matches the pressure measurements. In addition, the distributions of permeability and porosity must satisfy certain constraints given by the geological model and by other information known about the reservoir. 
Statement of Theory and Definitions The process of obtaining a reservoir description involves using a great amount of data from different sources. It is generally agreed that a reservoir description will be more complete and reliable when it is the outcome of a process that can use the maximum possible amount of data from different sources. This is usually referred to in the literature as "data integration." Reservoir data can be classified as "static" or "dynamic" depending on their connection to the movement or flow of fluids in the reservoir. Data originating from geology, logs, core analysis, seismic and geostatistics can generally be classified as static, whereas information originating from well testing and the production performance of the reservoir can be classified as dynamic. So far, most of the success in data integration has been obtained with static information. Remarkably, it has not yet become common to completely or systematically integrate dynamic data with static data. A number of researchers2–5 are studying this problem at present. This work represents one step in that direction. Well Testing as a Tool for Reservoir Description. Traditional well-test analysis provides good insight into the average properties of the reservoir in the vicinity of a well. Well testing can also identify the major features of relatively simple reservoirs, such as faults, fractures, double porosity, channels, and pinchouts, in the near-well area. The difficulties with this approach begin when it is necessary to use the well-test data on a larger scale, such as in the context of obtaining a reservoir description. One of the main reasons for these difficulties is that traditional well-test analysis handles transient pressure data collected at a single well at a time, and is restricted to a small time range. As a result, traditional well-test analysis does not make use of "pressure" events separated in historical time.
The use of several single and multiple well tests to describe reservoir heterogeneity has been reported in the literature;6 however, this approach is not applied commonly because of the extensive effort needed to obtain a reservoir description. The method presented in this paper uses a numerical model of the reservoir to overcome these shortcomings. It will be shown that pressure transients can be used effectively to infer reservoir properties at the scale of a reservoir description. Well-test data, both complete tests and occasional spot pressure measurements, will be used to this effect. The well-test information allows us to infer properties close to the wells and, when combined with the shut-in pressures (spot pressures), boundary information, and permeability-porosity correlations, provides the larger-scale description. General Description of the Method The proposed method is similar to other parameter estimation methods and thus consists of the following major items: the mathematical model, the objective function, and the minimization algorithm. Mathematical Model. Because of the complexity of the reservoir description, the reservoir response must be computed numerically; therefore, the pressure response is found using a numerical simulator. The reservoir is discretized into blocks. The objective is to find a suitable permeability-porosity distribution so that values of these parameters can be assigned to each of the blocks.
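The three ingredients named above (forward model, objective function, minimizer) can be illustrated with a toy parameter-estimation loop. The forward model below is a stand-in linear function, not a reservoir simulator, and the gradients are finite differences rather than the analytic sensitivity coefficients of the gradient-simulator approach the paper builds on:

```python
# Toy parameter-estimation loop: minimize the mismatch between computed
# and observed pressures by gradient descent on the model parameters.

def simulate(params):
    """Stand-in forward model: a response from two (k, phi)-like parameters."""
    k, phi = params
    return [k * t + phi for t in range(5)]  # placeholder pressure response

def objective(params, observed):
    """Sum of squared mismatches between computed and observed responses."""
    return sum((c - o) ** 2 for c, o in zip(simulate(params), observed))

def estimate(observed, params=(0.0, 0.0), lr=0.01, steps=2000, h=1e-6):
    """Gradient descent with finite-difference gradients on the objective."""
    params = list(params)
    for _ in range(steps):
        grads = []
        for i in range(len(params)):
            bumped = params[:]
            bumped[i] += h
            grads.append((objective(bumped, observed)
                          - objective(params, observed)) / h)
        params = [p - lr * g for p, g in zip(params, grads)]
    return params

observed = simulate((2.0, 5.0))          # synthetic "field" data
print([round(p, 2) for p in estimate(observed)])
```

A real inversion adds the geological constraints mentioned above as regularization terms in the objective, so that the recovered permeability-porosity fields honor the static data as well as the pressure history.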


2018 ◽  
Vol 488 (1) ◽  
pp. 237-257 ◽  
Author(s):  
Patrick William Michael Corbett ◽  
Gleyden Lucila Benítez Duarte

Abstract Two decades of geological modelling have resulted in the ability to study single-well geological models at a sufficiently high resolution to generate synthetic well test responses from numerical simulations in realistic geological models covering a range of fluvial styles. These 3D subsurface models are useful in aiding our understanding and mapping of the geological variation (as quantified by porosity and permeability contrasts) in the near-wellbore region. The building and analysis of these models enables many workflow steps, from matching well test data to improving history-matching. Well testing also has a key potential role in reservoir characterization for an improved understanding of the near-wellbore subsurface architecture in fluvial systems. Developing an understanding of well test responses from simple through increasingly more complex geological scenarios leads to a realistic, real-life challenge: a well test in a small fluvial reservoir. The geological well testing approach explained here, through a recent fluvial case study in South America, is considered to be useful in improving our understanding of reservoir performance. This approach should lead to more geologically and petrophysically consistent models, and to geologically assisted models that are both more correct and quicker to match to history, and thus, ultimately, to more useful reservoir models. It also allows the testing of a more complex geological model through the well test response.


2021 ◽  
Author(s):  
Elias Temer ◽  
Nahomi Zerpa Mendez ◽  
Yermek Kaipov

Abstract The oil industry has perpetually examined well testing methods, with the goal of improving overall efficiency, ensuring data quality, and streamlining processes to achieve program objectives. Over the years, the aim of drillstem testing (DST) has remained mostly unchanged. However, operators want to meet the forecasted production investments of their fields while improving operational efficiency and maintaining the highest operational standards, with safety and the environment being paramount. One of the solutions was developing a live, downhole, reservoir testing platform. The breakthrough consisted of introducing automation and real-time monitoring, so that the test program can be adjusted according to the actual reservoir response rather than blindly following a predefined program, providing better operational flexibility. The platform is united by a wireless telemetry technology that allows acoustic communication with downhole tools in real time. The automation of data acquisition, downhole tool actuation, and real-time monitoring of downhole operations gives operators the ability to perform well tests with reduced uncertainties, less human intervention, and improved data quality. The early availability of reservoir knowledge enables operational efficiencies by meeting the test objectives earlier, significantly reducing the overall test period and the associated well testing costs. This paper describes the common well test objectives and challenges, the overall design of the wireless telemetry system, and the automation of job preparation and execution of downhole operations that led to the successful completion of a well test campaign under very hostile conditions, in remote areas, and within a restricted time period.
The use of the telemetry system in several well testing campaigns in different regions of the world made it possible to control critical downhole equipment and to acquire reservoir data transmittable to the client's office in town in real time. Various operational examples are discussed to demonstrate how automated data acquisition and downhole operations control have been used to optimize operations.


2019 ◽  
Vol 3 (2) ◽  
pp. 111-118
Author(s):  
Bahtiar Wilantara ◽  
Raharjo Raharjo

This study aims to develop an analog compression tester into a digital compression tester, a measurement tool that provides effectiveness and efficiency to users.

This is a research and development (R&D) study, conducted in several steps: problem identification, information gathering, product design, product manufacture, expert validation, product revision, testing, and final production. The development of the analog compression tester was first validated by material experts, media experts, and 15 students, with 5 students for field trials. The subjects of this study were vocational students at Taman Karya Madya Teknik Kebumen. Data were collected using a questionnaire instrument and analyzed with descriptive qualitative and descriptive quantitative (percentage) techniques.

The results of the development of the digital compression tester design are: 1) the tools and materials used are an electric drill, grinder, cutter, goggles, gloves, masks, a ruler, acetylene welding equipment, a screwdriver, scissors, a digital dial pressure gauge, a hose, spark plugs, clamps, and nipple fittings; 2) the manufacturing process starts with cutting, followed by hole drilling, welding, and connecting the components; 3) the digital compression tester works by reading the pressure or compression of the engine and displaying it digitally on the monitor using the digital dial pressure gauge; 4) the validation results were: a) material experts, 89% (feasible); b) media experts, 85% (feasible); c) field trial students' responses on ease of use and reading, 90% (feasible). Thus, the digital compression tester is declared feasible to use for measurement.


SPE Journal ◽  
1996 ◽  
Vol 1 (02) ◽  
pp. 145-154 ◽  
Author(s):  
Dean S. Oliver
