Confidence Limits on the Parameters and Predictions of Slightly Compressible, Single-Phase Reservoirs

1977 ◽  
Vol 17 (01) ◽  
pp. 42-56 ◽  
Author(s):  
A.H. Dogru ◽  
T.N. Dixon ◽  
T.F. Edgar

Abstract Methods of nonlinear regression theory were applied to the reservoir history-matching problem to determine the effect of erroneous parameter estimates obtained from well testing on the future prediction of reservoir pressures. Two examples were studied: well testing in a radial one-dimensional slightly compressible reservoir and in an undersaturated, two-dimensional, heterogeneous oil field. The reservoir parameters of permeability, porosity, external radius, and pore volume were considered, and the effects of measurement error, test time, and flow rate on the confidence limits were computed. Introduction The operation of a reservoir simulator requires accurate estimates of the reservoir properties. However, the simulation parameters, such as permeability, porosity, and reservoir geometry, are usually unknown unless coring and physical property analysis have been undertaken. Because of the cost of these procedures, it is more desirable to use the pressures measured at the well during a well test and indirectly compute the important parameters of the system. By using history matching of the test data to obtain the system parameters, the future pressure behavior of the reservoir can be predicted. Several studies on history matching have indicated that the well-test approach for determining the reservoir parameters often suffers from incorrect and nonunique parameter estimates. The factors that affect the parameter estimation can be classified as model errors, observability, measurement errors or noise, history time, test procedure, and optimization procedure. Model errors arise from the inaccuracy of the model and the numerical integration.
For example, a reservoir simulator is only a reasonable approximation for flow through porous media. Solution of a model equation by numerical means also introduces roundoff and discretization errors. Observability of the system plays an important role in estimating the reservoir parameters. Depending on the location of the well and the number of data points, it may not be possible to determine uniquely all reservoir parameters from the measurements made at that well. Observability is strictly a function of the reservoir model used. At a given well, pressure measurements may only reflect the values of the parameters in specific zones of the reservoir. If a specific zone away from the well does not affect the measured pressure, then the system is not observable at that particular location. A rigorous definition of observability can be found in other papers. Measurement errors in the pressures and flow rates are another source of unrealistic parameter estimates. Longer history times always give more information about the reservoir as long as the system remains in a dynamic state. The nature of the system input (well flow rate) also affects the accuracy of the estimates and predictions. The final source of incorrect parameter estimates arises because the history-matching problem, posed mathematically, is usually a nonlinear programming problem that must be solved computationally. Such a problem yields multiple extrema that often can lead to a relative minimum (rather than a global minimum) in the numerical search for the smallest matching error. Also, the magnitude of the objective function can be quite insensitive to the parameters selected, thus causing the optimization procedure to terminate prematurely.
The above factors control the history-matching process; with actual data, it is usually impossible to identify the exact contributions of each factor to the errors in the parameter estimates. Since a certain amount of error will be introduced into the estimated parameters from the history-matching process, it is useful to study the magnitude of this error resulting from various sources under controlled simulation conditions. Also, it is important to determine how the errors in the parameters are reflected in the future predictions of the pressures. SPEJ P. 42
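The confidence-limit machinery applied here is standard nonlinear-regression practice: linearize the model about the least-squares estimate, form the parameter covariance from the sensitivity (Jacobian) matrix, and read off approximate intervals. A minimal sketch with a toy exponential-decline model standing in for the reservoir simulator (model, numbers, and noise level are illustrative assumptions, not from the paper):

```python
import numpy as np

# Toy forward model p(t) = a * exp(-b t): a hypothetical stand-in for the
# reservoir simulator, with (a, b) playing the role of parameters such as
# permeability or pore volume.
def model(theta, t):
    a, b = theta
    return a * np.exp(-b * t)

def jacobian(theta, t):
    a, b = theta
    return np.column_stack([np.exp(-b * t), -a * t * np.exp(-b * t)])

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 40)
theta_true = np.array([100.0, 0.3])
sigma = 0.5                                    # measurement-error std dev
p_obs = model(theta_true, t) + sigma * rng.normal(size=t.size)

# Gauss-Newton least squares for the parameter estimates.
theta = np.array([90.0, 0.35])
for _ in range(30):
    J = jacobian(theta, t)
    r = p_obs - model(theta, t)
    theta = theta + np.linalg.lstsq(J, r, rcond=None)[0]

# Linearized parameter covariance s^2 (J^T J)^{-1}, s^2 = SSR / (n - k),
# and approximate 95% confidence half-widths on each parameter.
J = jacobian(theta, t)
r = p_obs - model(theta, t)
s2 = (r @ r) / (t.size - theta.size)
cov = s2 * np.linalg.inv(J.T @ J)
half_width = 2.0 * np.sqrt(np.diag(cov))       # ~95% (t-value approx 2)

print("estimates:", theta)
print("95% half-widths:", half_width)
```

Longer test times and larger rate signals shrink J^T J's inverse, which is the mechanism behind the paper's observation that test design controls the confidence limits.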

Author(s):  
Geir Evensen

Abstract It is common to formulate the history-matching problem using Bayes’ theorem. From Bayes’ theorem, the conditional probability density function (pdf) of the uncertain model parameters is proportional to the prior pdf of the model parameters, multiplied by the likelihood of the measurements. The static model parameters are random variables characterizing the reservoir model, while the observations include, e.g., historical rates of oil, gas, and water produced from the wells. The reservoir prediction model is assumed perfect, and there are no errors besides those in the static parameters. However, this formulation is flawed. The historical rate data only approximately represent the real production of the reservoir and contain errors. History-matching methods usually take these errors into account in the conditioning but neglect them when forcing the simulation model by the observed rates during the historical integration. Thus, the model prediction depends on some of the same data used in the conditioning. The paper presents a formulation of Bayes’ theorem that considers the data dependency of the simulation model. In the new formulation, one must update both the poorly known model parameters and the rate-data errors. The result is an improved posterior ensemble of prediction models that better cover the observations with more substantial and realistic uncertainty. The implementation accounts correctly for correlated measurement errors and demonstrates the critical role of these correlations in reducing the update’s magnitude. The paper also shows the consistency of the subspace inversion scheme by Evensen (Ocean Dyn. 54, 539–560 2004) in the case with correlated measurement errors and demonstrates its accuracy when using a “larger” ensemble of perturbations to represent the measurement error covariance matrix.
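The role of correlated measurement errors in damping the update can be made concrete with a small ensemble step. The sketch below implements a generic ensemble-smoother update, Xa = X + Cxy (Cyy + R)^-1 (D - Y), with a correlated error covariance R; the linear "simulator" G, the sizes, and the error model are illustrative assumptions, not Evensen's formulation:

```python
import numpy as np

rng = np.random.default_rng(1)
Ne, Nx, Nd = 200, 3, 5           # ensemble size, no. parameters, no. data

# Prior parameter ensemble and a toy linear "simulator" G (hypothetical).
X = rng.normal(0.0, 1.0, size=(Nx, Ne))
G = rng.normal(size=(Nd, Nx))
Y = G @ X                        # predicted data for each member

# Correlated measurement-error covariance R (e.g., rate errors that
# persist in time) and perturbed observations, one column per member.
d_true = G @ np.ones(Nx)         # data from a "truth" of all ones
L = np.tril(np.full((Nd, Nd), 0.6)) + 0.4 * np.eye(Nd)
R = L @ L.T
E = L @ rng.normal(size=(Nd, Ne))
D = d_true[:, None] + E

# Ensemble update: Xa = X + Cxy (Cyy + R)^-1 (D - Y).
Xm, Ym = X.mean(1, keepdims=True), Y.mean(1, keepdims=True)
Cxy = (X - Xm) @ (Y - Ym).T / (Ne - 1)
Cyy = (Y - Ym) @ (Y - Ym).T / (Ne - 1)
Xa = X + Cxy @ np.linalg.solve(Cyy + R, D - Y)

print("prior mean:", X.mean(1))
print("posterior mean:", Xa.mean(1))   # pulled toward the truth (all ones)
```

Inflating R, or adding off-diagonal correlation, visibly shrinks the update's magnitude, which is the effect the paper emphasizes.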


Mathematics ◽  
2021 ◽  
Vol 9 (24) ◽  
pp. 3289
Author(s):  
Emil N. Musakaev ◽  
Sergey P. Rodionov ◽  
Nail G. Musakaev

A three-dimensional numerical hydrodynamic model describes the processes of developing oil and gas fields fairly accurately, and it has good predictive properties only if there are high-quality input data and comprehensive information about the reservoir. However, under conditions of high input-data uncertainty, measurement errors, and significant time and resource costs for processing and analyzing large amounts of data, the use of such models may be unreasonable and can lead to ill-posed problems: either the uniqueness of the solution or its stability is violated. A well-known method for dealing with these problems is regularization, i.e., the addition of some a priori information. In contrast to full-scale modeling, reduced-physics models are currently under active development; they are used above all when an operational decision is required and computational resources are limited. One of the most popular simplified models is the material balance model, which makes it possible to directly capture the relationship between reservoir pressure, flow rates, and the integral reservoir characteristics. In this paper, it is proposed to use a hierarchical approach to the problem of oil-field waterflooding control, applying material balance models in successive approximations: first for the field as a whole, then for hydrodynamically connected blocks of the field, and then for wells. When moving from one level of model detail to the next, the modeling results from the previous levels of the hierarchy are used as additional regularizing information, which ultimately makes it possible to correctly solve the history matching problem (identification of the filtration model) under conditions of incomplete input information.
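The material balance model mentioned above can be sketched at its simplest: for a slightly compressible single-phase tank, net withdrawal depletes pressure through the total compressibility and pore volume. A minimal illustration (all values hypothetical):

```python
# Minimal tank material-balance sketch for a slightly compressible,
# single-phase reservoir block: pressure responds to net withdrawal
# through total compressibility and pore volume. Numbers illustrative.

def material_balance_pressure(p_init, pore_volume, c_total, rates, dt):
    """March reservoir pressure given net withdrawal per timestep.

    rates: net reservoir-volume withdrawal rate (production minus
           injection) at each step; positive values deplete pressure.
    """
    p = p_init
    history = []
    for q in rates:
        # dp = -q * dt / (c_t * V_p): integrated compressibility relation
        p -= q * dt / (c_total * pore_volume)
        history.append(p)
    return history

# Example: 1e7 rb pore volume, ct = 1e-5 1/psi, 1,000 rb/d net withdrawal.
hist = material_balance_pressure(
    p_init=4000.0, pore_volume=1e7, c_total=1e-5,
    rates=[1000.0] * 30, dt=1.0)
print(round(hist[-1], 1))   # pressure after 30 days -> 3700.0
```

In the hierarchical scheme, a fit of such a tank model at the field level would constrain the block-level fits, and those in turn the well-level ones, supplying the regularizing information the paper describes.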


1974 ◽  
Vol 14 (06) ◽  
pp. 593-608 ◽  
Author(s):  
W.H. Chen ◽  
G.R. Gavalas ◽  
J.H. Seinfeld ◽  
M.L. Wasserman

Abstract History-matching problems, in which reservoir parameters are to be estimated from well pressure data, are formulated as optimal control problems. The necessary conditions for optimality lead naturally to gradient optimization methods for determining the optimal parameter estimates. The key feature of the approach is that reservoir properties are considered as continuous functions of position rather than as uniform in a certain number of zones. The optimal control approach is illustrated on a hypothetical reservoir and on an actual Saudi Arabian reservoir, both characterized by single-phase flow. A significant saving in computing time over conventional constant-zone gradient optimization methods is demonstrated. Introduction The process of determining in a mathematical reservoir model unknown parameter values, such as permeability and porosity, that give the closest fit of measured and calculated pressures is commonly called "history matching." In principle, one would like an automatic routine for history matching, applicable to simulators of varying complexity, one that does not require inordinate amounts of computing time to achieve a set of parameter estimates. In recent years a number of authors have investigated the subject of history matching. All the reported approaches involve dividing the reservoir into a number of zones, in each of which the properties to be estimated are assumed to be uniform. (These zones may, in fact, correspond to the spatial grid employed for the finite-difference solution of the simulator.)
Then the history-matching problem becomes that of determining the parameter values in each of, say, N zones, k1, k2, ..., kN, in such a way that some measure (usually a sum of squares) of the deviation between calculated and observed pressures is minimized. A typical measure of deviation is J(k) = Σ_{j=1}^{M} Σ_{i=1}^{n_j} [p_obs(j, t_i) - p_cal(j, t_i)]^2, ..........(1) where p_obs(j, t_i) and p_cal(j, t_i) are the observed and calculated pressures at the jth well, which is at location (x_j, y_j), j = 1, 2, ..., M, and where we have n1 measurements at Well 1 at n1 different times, n2 measurements at Well 2 at n2 different times, ..., and nM measurements at Well M at nM different times. To carry out the minimization of Eq. 1 with respect to the vector k, most methods rely on some type of gradient optimization procedure that requires computation of the gradient of J with respect to each ki, i = 1, 2, ..., N. The calculation of ∂J/∂ki usually requires, in turn, that one obtain the sensitivity coefficients ∂p_cal/∂ki, i = 1, 2, ..., N; i.e., the first partial derivative of pressure with respect to each parameter. The sensitivity coefficients can be computed, in principle, in several ways. 1. Make a simulator base run with all N parameters at their initial values. Then, perturbing each parameter a small amount, make an additional simulator run for each parameter in the system. SPEJ P. 593
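Option 1 above, the perturbation approach, can be sketched as follows, with a toy function standing in for the simulator; the gradient of J then follows from the sensitivity coefficients by the chain rule. Model and numbers are illustrative:

```python
import numpy as np

# Perturbation route to sensitivity coefficients dp_cal/dk_i: one base
# run plus one extra simulator run per parameter (N+1 runs in total).
# `simulate` is a hypothetical stand-in for the reservoir simulator,
# mapping N zone parameters k to calculated well pressures.

def simulate(k):
    """Toy pressure response at one well for N zone parameters k."""
    t = np.linspace(1.0, 5.0, 10)
    return 1000.0 - 2.0 * t * np.sqrt(k).sum()   # illustrative only

def sensitivities(k, eps=1e-4):
    base = simulate(k)
    S = np.empty((base.size, k.size))
    for i in range(k.size):
        kp = k.copy()
        kp[i] += eps                              # perturb one zone
        S[:, i] = (simulate(kp) - base) / eps     # dp_cal/dk_i
    return base, S

def misfit_gradient(k, p_obs):
    """Gradient of J(k) = sum (p_obs - p_cal)^2 via the chain rule."""
    p_cal, S = sensitivities(k)
    return -2.0 * S.T @ (p_obs - p_cal)

k = np.array([50.0, 120.0, 80.0])                 # three zone parameters
p_obs = simulate(np.array([60.0, 100.0, 90.0]))   # synthetic observations
g = misfit_gradient(k, p_obs)
print("gradient of J w.r.t. each zone parameter:", g)
```

The N+1 simulator runs per gradient are exactly the cost that the paper's optimal-control (adjoint) formulation avoids, which is where its computing-time saving comes from.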


2018 ◽  
Vol 488 (1) ◽  
pp. 237-257 ◽  
Author(s):  
Patrick William Michael Corbett ◽  
Gleyden Lucila Benítez Duarte

Abstract Two decades of geological modelling have resulted in the ability to study single-well geological models at a sufficiently high resolution to generate synthetic well test responses from numerical simulations in realistic geological models covering a range of fluvial styles. These 3D subsurface models are useful in aiding our understanding and mapping of the geological variation (as quantified by porosity and permeability contrasts) in the near-wellbore region. The building and analysis of these models enables many workflow steps, from matching well test data to improving history-matching. Well testing also has a key potential role in reservoir characterization for an improved understanding of the near-wellbore subsurface architecture in fluvial systems. Developing an understanding of well test responses from simple through increasingly more complex geological scenarios leads to a realistic, real-life challenge: a well test in a small fluvial reservoir. The geological well testing approach explained here, through a recent fluvial case study in South America, is considered to be useful in improving our understanding of reservoir performance. This approach should lead to more geologically and petrophysically consistent models, and to geologically assisted models that are both more correct and quicker to match to history, and thus, ultimately, to more useful reservoir models. It also allows the testing of a more complex geological model through the well test response.


Solar Energy ◽  
1991 ◽  
Vol 47 (1) ◽  
pp. 1-16 ◽  
Author(s):  
B. Bourges ◽  
A. Rabl ◽  
B. Leide ◽  
M.J. Carvalho ◽  
M. Collares-Pereira

SPE Journal ◽  
2017 ◽  
Vol 22 (05) ◽  
pp. 1506-1518 ◽  
Author(s):  
Pedram Mahzari ◽  
Mehran Sohrabi

Summary Three-phase flow in porous media during water-alternating-gas (WAG) injection and the associated cycle-dependent hysteresis have been the subject of experimental and theoretical studies. Despite attempts to develop models and simulation methods for WAG injection and three-phase flow, the current lack of a solid approach to handling hysteresis effects in simulating WAG-injection scenarios has led to misinterpretation of simulation outcomes at laboratory and field scales. In this work, using our improved methodology, the first cycle of the WAG experiments (first waterflood and the subsequent gasflood) was history matched to estimate the two-phase relative permeabilities (krs) for oil/water and gas/oil. For subsequent cycles, pertinent parameters of the WAG hysteresis model are included in the automatic-history-matching process to reproduce all WAG cycles together. The results indicate that history matching the whole WAG experiment leads to a significantly improved simulation outcome, which highlights the importance of two elements in evaluating WAG experiments: inclusion of the full WAG experiment in history matching, and use of a more-representative set of two-phase krs, derived from our new methodology for estimating two-phase krs from the first cycle of a WAG experiment. Because WAG-related parameters should be able to model any three-phase flow irrespective of the WAG scenario, in another exercise, the tuned parameters obtained from a WAG experiment starting with water were used in a similar coreflood test (WAG starting with gas) to assess predictive capability for simulating three-phase flow in porous media. After identifying shortcomings of existing models, an improved methodology was used to history match multiple coreflood experiments simultaneously to estimate parameters that can reasonably capture the processes taking place in WAG under different scenarios, that is, starting with water or with gas.
The comprehensive simulation study performed here would shed some light on a consolidated methodology to estimate saturation functions that can simulate WAG injections at different scenarios.
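Cycle-dependent hysteresis models of the kind tuned here commonly build on Land's trapping relation, in which the trapped-gas saturation after a flow reversal follows from the gas saturation attained before it. A minimal sketch of that relation only (not the authors' improved methodology; endpoint values are illustrative):

```python
# Land (1968) trapping relation, a common building block of WAG
# hysteresis models: Sgt = Sgi / (1 + C * Sgi), with C fixed by the
# endpoint pair (maximum trapped, maximum initial gas saturation).
# Endpoint values below are illustrative only.

def land_coefficient(s_gr_max, s_g_max):
    """Land coefficient C from the endpoints."""
    return 1.0 / s_gr_max - 1.0 / s_g_max

def trapped_gas(s_gi, c_land):
    """Trapped-gas saturation after a reversal at gas saturation s_gi."""
    return s_gi / (1.0 + c_land * s_gi)

C = land_coefficient(s_gr_max=0.30, s_g_max=0.70)
print(round(C, 3))                    # Land C for these endpoints
print(round(trapped_gas(0.50, C), 3))
```

In an automatic history match of the kind described, C (or the parameters of a richer cycle-dependent model) would sit alongside the two-phase kr parameters in the tuning vector.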


2021 ◽  
Author(s):  
Gabriela Chaves ◽  
Danielle Monteiro ◽  
Virgilio José Martins Ferreira

Abstract Commingling production nodes is standard practice in the industry to combine multiple segments into one. This practice is adopted at the subsurface or surface to reduce costs, elements (e.g., pipes), and space. However, it leads to one problem: determining the rates of the individual elements. This problem is recurrently solved in the platform scenario using the back-allocation approach, where the total platform flowrate is used to obtain the individual wells' flowrates. The wells' flowrates are crucial for monitoring, managing, and making operational decisions to optimize field production. This work combined outflow (well and flowline) simulation, reservoir inflow, algorithms, and an optimization problem to calculate the wells' flowrates and report a status for the current state of each well. A well flagged as unsuited indicates that the input data, the well model, or the well itself is not behaving as expected. The well status is valuable operational information that can be interpreted, for instance, as an indication that a new well test is needed, or as a reliability measure for simulation runs. The well flowrates are calculated for three scenarios: probable, minimum, and maximum. Real-time data are used as input, and production well tests are used to tune and update the well models and parameters routinely. The methodology was applied to a representative offshore oil field with 14 producing wells over a two-year production period. The back-allocation methodology showed robustness in all cases, labeling the wells properly, calculating the flowrates, and honoring the platform flowrate.
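One simple version of the back-allocation step is proportional allocation: distribute the measured platform total across wells in proportion to their modeled rates, then flag any well whose allocated rate falls outside the bounds given by its minimum and maximum scenarios. This is an illustrative sketch, not the paper's exact optimization formulation:

```python
# Proportional back-allocation sketch: well models give expected rates
# (with min/max bounds from the minimum and maximum scenarios), and the
# measured platform total is distributed in proportion to the modeled
# rates. Wells whose allocated rate violates their bounds are flagged
# "unsuited" for review. All numbers are illustrative.

def back_allocate(modeled, platform_total, bounds):
    scale = platform_total / sum(modeled)
    allocated = [scale * q for q in modeled]
    status = [
        "ok" if lo <= q <= hi else "unsuited"
        for q, (lo, hi) in zip(allocated, bounds)
    ]
    return allocated, status

modeled = [1200.0, 800.0, 500.0]        # probable-scenario well rates
bounds = [(1000.0, 1400.0), (600.0, 1000.0), (450.0, 520.0)]
alloc, status = back_allocate(modeled, platform_total=2700.0, bounds=bounds)
print([round(q, 1) for q in alloc], status)
```

The allocated rates honor the platform total by construction, and the "unsuited" flag plays the role described above: it points at stale input data, a mistuned well model, or a well behaving unexpectedly.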


2021 ◽  
Vol 73 (04) ◽  
pp. 60-61
Author(s):  
Chris Carpenter

This article, written by JPT Technology Editor Chris Carpenter, contains highlights of paper SPE 199149, “Rate-Transient-Analysis-Assisted History Matching With a Combined Hydraulic Fracturing and Reservoir Simulator,” by Garrett Fowler, SPE, and Mark McClure, SPE, ResFrac, and Jeff Allen, Recoil Resources, prepared for the 2020 SPE Latin American and Caribbean Petroleum Engineering Conference, originally scheduled to be held in Bogota, Colombia, 17–19 March. The paper has not been peer reviewed. This paper presents a step-by-step work flow to facilitate history matching numerical simulation models of hydraulically fractured shale wells. Sensitivity-analysis simulations are performed with a coupled hydraulic fracturing, geomechanics, and reservoir simulator. The results are used to develop what the authors term “motifs” that inform the history-matching process. Using intuition from these simulations, history matching can be expedited by changing matrix permeability, fracture conductivity, matrix-pressure-dependent permeability, boundary effects, and relative permeability.
Introduction The concept of rate-transient analysis (RTA) involves the use of rate and pressure trends of producing wells to estimate properties such as permeability and fracture surface area. While very useful, RTA is an analytical technique and has commensurate limitations. In the complete paper, different RTA motifs are generated using a simulator. Insights from these motif simulations are used to modify simulation parameters to expedite and inform the history-matching process. The simulation history-matching work flow presented includes the following steps: 1. Set up a simulation model with geologic properties, wellbore and completion designs, and fracturing and production schedules. 2. Run an initial model. 3. Tune the fracture geometries (height and length) to heuristic data: microseismic, frac-hit data, distributed acoustic sensing, or other diagnostics. 4. Match instantaneous shut-in pressure (ISIP) and wellhead pressure (WHP) during injection. 5. Make RTA plots of the real and simulated production data. 6. Use the motifs presented in the paper to identify possible production mechanisms in the real data. 7. Adjust history-matching parameters in the simulation model based on the intuition gained from RTA of the real data. 8. Iterate Steps 5 through 7 to obtain a match in RTA trends. 9. Modify relative permeabilities as necessary to obtain correct oil, water, and gas proportions. In this study, the authors used a commercial simulator that fully integrates hydraulic fracturing, wellbore, and reservoir simulation into a single modeling code.
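The RTA plots of Steps 5 and 6 can be sketched with rate-normalized pressure, RNP = (p_i - p_wf) / q, plotted against the square root of time: during linear flow to the fractures, RNP is a straight line in sqrt(t), and its slope varies inversely with fracture surface area. Synthetic data, illustrative only:

```python
import numpy as np

# Rate-normalized pressure diagnostic: RNP = (p_i - p_wf) / q versus
# sqrt(t). A straight line on this plot signals linear flow; its slope
# is the quantity compared between real and simulated data when
# matching RTA trends. Synthetic, illustrative data below.

t = np.arange(1.0, 101.0)                     # days
p_i = 5000.0                                  # initial pressure, psi
q = 300.0 / np.sqrt(t)                        # declining rate at fixed p_wf
p_wf = np.full_like(t, 2000.0)                # flowing bottomhole pressure

rnp = (p_i - p_wf) / q                        # rate-normalized pressure
slope, intercept = np.polyfit(np.sqrt(t), rnp, 1)
print("linear-flow slope:", round(slope, 2))  # straight line -> linear flow
```

Comparing this slope between the real and simulated RNP plots is one concrete way to carry out the "match in RTA trends" of Step 8.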
Matching Fracturing Data The complete paper focuses on matching production data, assisted by RTA, not specifically on the matching of fracturing data such as injection pressure and fracture geometry (Steps 3 and 4). Nevertheless, for completeness, these steps are very briefly summarized in this section. Effective fracture toughness is the most important factor in determining fracture length. Field diagnostics suggest considerable variability in effective fracture toughness and fracture length. Typical half-lengths are between 500 and 2,000 ft. Laboratory-derived values of fracture toughness yield longer fractures (propagation of 2,000 ft or more from the wellbore). Significantly larger values of fracture toughness are needed to explain the shorter fracture lengths and higher net pressures that are often observed. The authors use a scale-dependent fracture-toughness parameter to increase toughness as the fracture grows. This allows the simulator to match injection-pressure data while simultaneously limiting fracture length. This scale-dependent toughness parameter is the most important parameter in determining fracture size.


2021 ◽  
Author(s):  
Nagaraju Reddicharla ◽  
Subba Ramarao Rachapudi ◽  
Indra Utama ◽  
Furqan Ahmed Khan ◽  
Prabhker Reddy Vanam ◽  
...  

Abstract Well testing is one of the vital processes in reservoir performance monitoring. As a field matures and the well stock grows, testing becomes a tedious job in terms of resources (MPFMs and test separators), and this affects production-quota delivery. In addition, test-data validation and approval follow a business process that needs up to 10 days to accept or reject a well test. Almost 10,000 well tests were conducted, and around 10 to 15% of them were rejected per year. The objective of this paper is to develop a methodology to reduce well-test rejections and to raise a timely flag for operator intervention to recommence the well test. The case study was applied in a mature field that has been producing for 40 years, so a good volume of historical well-test data is available. This paper discusses the development of a data-driven well-test analyzer and optimizer, supported by artificial intelligence (AI), for wells tested using MPFMs, in a two-stage approach. The motivating idea is to ingest historical data, real-time data, and the well-model performance curve, and to score the quality of the well-test data so that the operator is flagged in real time. The ML prediction results help testing operations and can reduce the test-acceptance turnaround drastically, from 10 days to hours. In the second layer, an unsupervised model built on historical data helps identify the parameters that drive well-test rejection, for example test duration, choke size, and GOR. The outcome of the modeling will be incorporated into updates of the well-test procedure and testing philosophy. This approach is under evaluation in one of the ADNOC Onshore assets. The results are expected to reduce well-test rejections by at least 5%, which further optimizes the resources required and improves the back-allocation process.
Furthermore, real-time flagging of test quality will help reduce the validation cycle from 10 days to hours and improve the well-testing cycle. This methodology improves integrated-reservoir-management compliance with well-testing requirements in assets where resources are limited, and it is envisioned to be integrated with a full-field digital-oilfield implementation. This is a novel approach to applying machine learning and artificial intelligence to well testing. It maximizes the utilization of real-time data to create an advisory system that improves test-data quality monitoring and enables timely decision-making to reduce well-test rejections.
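The first-layer idea, scoring an incoming test in real time and flagging it for operator intervention, can be caricatured with simple stability checks; the thresholds and features below are illustrative assumptions, not the asset's actual ML model:

```python
# Toy real-time well-test quality flag: score an incoming MPFM test from
# simple stability and duration checks so the operator can intervene
# before the test slot is wasted. Thresholds are illustrative only; the
# paper's approach uses ML models trained on historical test data.

def flag_well_test(rates, test_hours, min_hours=6.0, max_cv=0.15):
    """Return 'accept' or 'review' for a list of rate samples."""
    mean = sum(rates) / len(rates)
    var = sum((r - mean) ** 2 for r in rates) / len(rates)
    cv = (var ** 0.5) / mean if mean > 0 else float("inf")
    if test_hours < min_hours:
        return "review"          # test too short to stabilize
    if cv > max_cv:
        return "review"          # rate not stable enough
    return "accept"

print(flag_well_test([980, 1010, 995, 1005], test_hours=8.0))   # stable
print(flag_well_test([600, 1400, 900, 1100], test_hours=8.0))   # erratic
```

Test duration and rate stability are two of the rejection drivers the abstract names (alongside choke size and GOR), which is why they appear as the features here.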


2021 ◽  
Author(s):  
Ali Al-Turki ◽  
Obai Alnajjar ◽  
Majdi Baddourah ◽  
Babatunde Moriwawon

Abstract Algorithms and workflows have been developed to couple efficient model parameterization with stochastic, global optimization using a Multi-Objective Genetic Algorithm (MOGA) for global history matching, coupled with an advanced workflow for streamline sensitivity-based inversion for fine-tuning. During parameterization, the low-rank subsets of the most influential reservoir parameters are identified and propagated to the MOGA to perform the field-level history match. Data misfits between the field historical data and simulation data are calculated with multiple realizations of reservoir models that quantify and capture reservoir uncertainty. Each generation of the optimization algorithm reduces the data misfit relative to the previous iteration. This iterative process continues until a satisfactory field-level history match is reached or there are no further improvements. The fine-tuning process of well-connectivity calibration is then performed with a streamline sensitivity-based inversion algorithm to locally update the model and reduce well-level mismatch. In this study, an application of the proposed algorithms and workflow is demonstrated for model calibration and history matching. The synthetic reservoir model used in this study is discretized into millions of grid cells with hundreds of producer and injector wells. It is designed to generate several decades of production and injection history to evaluate and demonstrate the workflow. In field-level history matching, reservoir rock properties (e.g., permeability, fault transmissibility, etc.) are parameterized to conduct the global match of pressure and production rates. The Grid Connectivity Transform (GCT) was used and assessed to parameterize the reservoir properties. In addition, the convergence rate and history-match quality of the MOGA were assessed during the field (global) history matching.
Also, the effectiveness of the streamline-based inversion was evaluated by quantifying the additional improvement in history matching quality per well. The developed parameterization and optimization algorithms and workflows revealed the unique features of each of the algorithms for model calibration and history matching. This integrated workflow has successfully defined and carried uncertainty throughout the history matching process. Following the successful field-level history match, the well-level history matching was conducted using streamline sensitivity-based inversion, which further improved the history match quality and conditioned the model to historical production and injection data. In general, the workflow results in enhanced history match quality in a shorter turnaround time. The geological realism of the model is retained for robust prediction and development planning.
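The generation-by-generation misfit reduction described above is the core of any genetic-algorithm history match. Below is a single-objective toy version: the quadratic misfit stands in for the simulator-based pressure/rate mismatch, and MOGA's multi-objective ranking is not reproduced:

```python
import random

# Elitist genetic-algorithm sketch: each generation keeps the
# lowest-misfit parameter realizations and breeds the next generation
# from them, so the best misfit is non-increasing across generations.
# The quadratic misfit is a hypothetical stand-in for the simulator
# mismatch; TARGET plays the role of the low-rank "true" parameters.

random.seed(7)
TARGET = [0.3, -1.2, 2.0]

def misfit(x):
    return sum((a - b) ** 2 for a, b in zip(x, TARGET))

pop = [[random.uniform(-3, 3) for _ in TARGET] for _ in range(40)]
for gen in range(60):
    pop.sort(key=misfit)
    elite = pop[:10]                      # survivors of this generation
    children = []
    while len(children) < 30:
        p1, p2 = random.sample(elite, 2)
        cut = random.randrange(1, len(TARGET))      # one-point crossover
        child = p1[:cut] + p2[cut:]
        if random.random() < 0.3:                   # mutation
            i = random.randrange(len(TARGET))
            child[i] += random.gauss(0, 0.2)
        children.append(child)
    pop = elite + children

best = min(pop, key=misfit)
print("best misfit:", round(misfit(best), 4))
```

Elitism is what guarantees the "each generation reduces the data misfit" property stated in the abstract; in the full workflow the search runs over the low-rank GCT coefficients rather than raw grid properties.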

