The Tengiz Field History Matching Problem Revisited

Author(s):  
Guohua Gao ◽  
Mohammad Zafari ◽  
A.C. Reynolds
Author(s):  
Geir Evensen

Abstract It is common to formulate the history-matching problem using Bayes’ theorem. From Bayes’ theorem, the conditional probability density function (pdf) of the uncertain model parameters is proportional to the prior pdf of the model parameters, multiplied by the likelihood of the measurements. The static model parameters are random variables characterizing the reservoir model while the observations include, e.g., historical rates of oil, gas, and water produced from the wells. The reservoir prediction model is assumed perfect, and there are no errors besides those in the static parameters. However, this formulation is flawed. The historical rate data only approximately represent the real production of the reservoir and contain errors. History-matching methods usually take these errors into account in the conditioning but neglect them when forcing the simulation model by the observed rates during the historical integration. Thus, the model prediction depends on some of the same data used in the conditioning. The paper presents a formulation of Bayes’ theorem that considers the data dependency of the simulation model. In the new formulation, one must update both the poorly known model parameters and the rate-data errors. The result is an improved posterior ensemble of prediction models that better cover the observations with more substantial and realistic uncertainty. The implementation accounts correctly for correlated measurement errors and demonstrates the critical role of these correlations in reducing the update’s magnitude. The paper also shows the consistency of the subspace inversion scheme by Evensen (Ocean Dyn. 54, 539–560, 2004) in the case with correlated measurement errors and demonstrates its accuracy when using a “larger” ensemble of perturbations to represent the measurement error covariance matrix.
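The update the abstract describes builds on the standard perturbed-observation ensemble update, with the key point being that the measurement-error covariance R is kept fully correlated rather than diagonal. The following is a minimal sketch of such an update; the toy linear forward model G, the dimensions, and the exponential error correlation in R are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
Ne, Nm, Nd = 100, 3, 5          # ensemble size, no. of parameters, no. of data

# Toy linear forward model (stand-in for the reservoir simulator)
G = rng.normal(size=(Nd, Nm))
def forward(m):
    return G @ m

# Prior ensemble of static model parameters
X = rng.normal(size=(Nm, Ne))

# Correlated measurement-error covariance, e.g. rate errors correlated in time
t = np.arange(Nd)
R = 0.1 * np.exp(-np.abs(t[:, None] - t[None, :]) / 2.0)

# Synthetic truth and noisy observations
m_true = rng.normal(size=Nm)
d_obs = forward(m_true) + rng.multivariate_normal(np.zeros(Nd), R)

# Predicted data for each member, and sample (cross-)covariances from anomalies
D = forward(X)
Xa = X - X.mean(axis=1, keepdims=True)
Da = D - D.mean(axis=1, keepdims=True)
Cxd = Xa @ Da.T / (Ne - 1)
Cdd = Da @ Da.T / (Ne - 1)

# Perturbed observations drawn with the full correlated R (not a diagonal one)
E = rng.multivariate_normal(np.zeros(Nd), R, size=Ne).T
K = Cxd @ np.linalg.inv(Cdd + R)
X_post = X + K @ (d_obs[:, None] + E - D)

print("prior misfit :", np.abs(forward(X).mean(axis=1) - d_obs).mean())
print("post  misfit :", np.abs(forward(X_post).mean(axis=1) - d_obs).mean())
```

Because R enters both the gain and the observation perturbations, stronger error correlations enlarge the effective uncertainty along correlated directions and reduce the magnitude of the update, consistent with the role the abstract attributes to these correlations.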


2006 ◽  
Author(s):  
Sergio Henrique Guerra Sousa ◽  
Celio Maschio ◽  
Denis Jose Schiozer

1984 ◽  
Vol 24 (06) ◽  
pp. 697-706 ◽  
Author(s):  
A.T. Watson ◽  
G.R. Gavalas ◽  
J.H. Seinfeld

Abstract Since the number of parameters to be estimated in a reservoir history match is potentially quite large, it is important to determine which parameters can be estimated with reasonable accuracy from the available data. This aspect can be called determining the identifiability of the parameters. The identifiability of porosity and absolute and relative permeabilities on the basis of flow and pressure data in a two-phase (oil/water) reservoir is considered. The question posed is: How accurately can one expect to estimate spatially variable porosity, absolute permeability, and relative permeabilities given typical production and pressure data? To gain insight into this question, analytical solutions for pressure and saturation in a one-dimensional (1D) waterflood are used. The following conclusions are obtained. (1) Only the average value of the porosity can be determined on the basis of water/oil flow measurements. (2) The permeability distribution can be determined from pressure drop data with an accuracy depending on the mobility ratio. (3) Exponents in a power-function representation of the relative permeabilities can be determined from WOR data alone, but not nearly so accurately as when pressure drop and flow data are used simultaneously. Introduction The utility of reservoir simulation in predicting reservoir behavior is limited by the accuracy with which reservoir properties can be estimated. Because of the high costs associated with coring analysis, reservoir engineers must rely on history matching as a means of estimating reservoir properties.
In this process a history match is carried out by choosing the reservoir properties as those that result in simulated well pressure and flow data that match as closely as possible those measured during production. In general, reservoir properties at each gridblock in the simulator represent the unknown values to be determined. Although there are efficient methods for estimating such a large number of unknowns, it has long been recognized from the results of single-phase history-matching exercises that many different sets of parameter values may yield a nearly identical match of observed and predicted pressures. The conventional single-phase history-matching problem is in fact a mathematically ill-posed problem, which explains its nonunique behavior. Such a situation is, in short, the result of the large number of unknowns to be estimated on the basis of the available data and the lack of sensitivity of the simulator solutions to the parameters. Because of this lack of sensitivity, the need to reduce the number of unknown parameters or to introduce some additional constraints, such as "smoothness" of the estimated parameters, has been recognized. A problem as important as that of choosing which minimization method to employ in history matching is that of choosing, on the basis of the available well data, which properties actually should be estimated. This selection depends on the relationship of the unknown parameters to the simulated well data. Ideally one would want to know (1) which parameters can be determined uniquely if the measurements were exact, and (2) given the expected level of error in the measurements, how accurately we can expect to be able to estimate the parameters.
The first question, that of establishing uniqueness of the estimated parameters, is notoriously difficult to answer, and for a problem as complicated as reservoir history matching, there are virtually no general results available that allow one to establish uniqueness for permeability or porosity. Thus, it is not possible in general to base our choice of which parameters to estimate on rigorous mathematical uniqueness results. In lieu of an answer to Question 1, the selection of parameters to be estimated can be based on Question 2, which is amenable to theoretical analysis. If the expected errors in estimation of any of the parameters, or any linear combination of the parameters, are extremely large, then that parameter or set of parameters can be judged as not identifiable. In such a case, steps may be taken to reduce the number of unknown parameters. In summary, the reservoir history-matching problem is a difficult parameter-estimation problem, and understanding the relationship between the unknown parameters and the measured data is essential to obtaining meaningful estimates of the reservoir properties. Quantitative studies regarding the accuracy of estimates for single-phase history-matching problems have been reported by Shah et al. and Dogru et al. Shah et al. investigated the optimal level of zonation for use with 1D single-phase (oil) situations. SPEJ P. 697
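Conclusion (3) above, that power-function exponents of the relative permeabilities are identifiable from water-cut-type data, can be illustrated with a toy fit. The Corey-type form, the mobility ratio, the true exponents, and the brute-force search below are all illustrative assumptions, not the paper's analytical treatment.

```python
import numpy as np

# Corey-type power-law relative permeabilities; the exponents are the unknowns
def water_cut(S, nw, no, M=2.0):
    krw = S ** nw                # water relative permeability, krw ~ S^nw
    kro = (1.0 - S) ** no        # oil relative permeability, kro ~ (1-S)^no
    # fractional flow of water for an assumed endpoint mobility ratio M
    return (M * krw) / (M * krw + kro)

# Synthetic "observed" water-cut data from known true exponents, with noise
rng = np.random.default_rng(1)
S = np.linspace(0.05, 0.95, 20)
fw_obs = water_cut(S, nw=2.0, no=3.0) + rng.normal(0.0, 0.005, S.size)

# Brute-force search over the two exponents (a stand-in for a real estimator)
grid = np.arange(1.0, 4.05, 0.05)
best = min(((nw, no) for nw in grid for no in grid),
           key=lambda p: np.sum((water_cut(S, *p) - fw_obs) ** 2))
print(best)
```

With data spanning the whole saturation range, the low-saturation behavior constrains the water exponent and the high-saturation behavior constrains the oil exponent, which is why both are recoverable here; identifiability degrades as the noise level grows, in line with Question 2 above.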


Mathematics ◽  
2021 ◽  
Vol 9 (24) ◽  
pp. 3289
Author(s):  
Emil N. Musakaev ◽  
Sergey P. Rodionov ◽  
Nail G. Musakaev

A three-dimensional numerical hydrodynamic model describes the processes of developing oil and gas fields fairly accurately, and has good predictive properties only if there are high-quality input data and comprehensive information about the reservoir. However, under conditions of high uncertainty in the input data, measurement errors, and significant time and resource costs for processing and analyzing large amounts of data, the use of such models may be unreasonable and can lead to ill-posed problems: either the uniqueness of the solution or its stability is violated. A well-known method for dealing with these problems is regularization, i.e., the addition of some a priori information. In contrast to full-scale modeling, reduced-physics models are currently under active development; they are used primarily when an operational decision must be made and computational resources are limited. One of the most popular simplified models is the material balance model, which directly captures the relationship between reservoir pressure, flow rates, and the integral reservoir characteristics. In this paper, it is proposed to solve the problem of oil-field waterflooding control with a hierarchical approach, applying material balance models in successive approximations: first to the field as a whole, then to hydrodynamically connected blocks of the field, and then to wells. When moving from one level of model detail to the next, the modeling results from the previous levels of the hierarchy are used as additional regularizing information, which ultimately makes it possible to correctly solve the history-matching problem (identification of the filtration model) under incomplete input information.
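The material balance model the abstract refers to can be reduced, in its simplest tank form, to marching reservoir pressure from net voidage through the total compressibility. The sketch below is a minimal single-tank illustration; all numerical values (initial pressure, pore volume, compressibility, rates) are illustrative assumptions, not data from the paper.

```python
# Single-tank material balance: pressure responds to net withdrawal through
# the total compressibility, dp = -(q_prod - q_inj) * dt / (c_t * V_pore).
p_init = 300.0     # initial reservoir pressure, bar (illustrative)
pore_vol = 5.0e7   # pore volume, m^3 (illustrative)
c_t = 1.0e-4       # total compressibility, 1/bar (illustrative)

def reservoir_pressure(q_prod, q_inj, dt):
    """March tank pressure in time from production/injection rates (m^3/day)."""
    p, history = p_init, []
    for qp, qi in zip(q_prod, q_inj):
        p -= (qp - qi) * dt / (c_t * pore_vol)
        history.append(p)
    return history

days = 365
prod = [4000.0] * days          # constant field production rate
inj = [3000.0] * days           # partial voidage replacement by waterflood
p = reservoir_pressure(prod, inj, dt=1.0)
print(round(p[-1], 1))          # pressure after one year of under-injection
```

In the hierarchical scheme described above, a balance like this would first be matched for the whole field, and the resulting pressures and rates would then constrain the per-block and per-well balances at the finer levels.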


Author(s):  
Y. Melnikova ◽  
A. Zunino ◽  
K. Lange ◽  
K.S. Cordua ◽  
K. Mosegaard

2014 ◽  
Author(s):  
Baurzhan Kassenov ◽  
Gregory R. King ◽  
Moon Chaudhri ◽  
Aizada Abdrakhmanova ◽  
Steve Jenkins ◽  
...  

SPE Journal ◽  
2015 ◽  
Vol 20 (05) ◽  
pp. 962-982 ◽  
Author(s):  
Xiaodong Luo ◽  
Andreas S. Stordal ◽  
Rolf J. Lorentzen ◽  
Geir Nævdal

Summary The focus of this work is on an alternative implementation of the iterative-ensemble smoother (iES). We show that iteration formulae similar to those used by Chen and Oliver (2013) and Emerick and Reynolds (2012) can be derived by adopting a regularized Levenberg-Marquardt (RLM) algorithm (Jin 2010) to approximately solve a minimum-average-cost (MAC) problem. This not only leads to an alternative theoretical tool in understanding and analyzing the behavior of the aforementioned iES, but also provides insights and guidelines for further developments of the smoothing algorithms. For illustration, we compare the performance of an implementation of the RLM-MAC algorithm with that of the approximate iES used by Chen and Oliver (2013) in three numerical examples: an initial condition estimation problem in a strongly nonlinear system, a facies estimation problem in a 2D reservoir, and the history-matching problem in the Brugge field case. In these three specific cases, the RLM-MAC algorithm exhibits comparable or better performance, especially in the strongly nonlinear system.
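The regularized Levenberg-Marquardt step that underlies the RLM-MAC derivation is, at its core, a damped Gauss-Newton iteration. The toy example below shows that building block on a two-parameter nonlinear least-squares problem; the forward model and all numbers are illustrative assumptions, and the real algorithm works with ensemble approximations rather than explicit Jacobians.

```python
import numpy as np

# Toy nonlinear forward model and its Jacobian (illustrative, not from the paper)
def g(m):
    return np.array([m[0] * np.exp(m[1]), m[0] + m[1] ** 2])

def jac(m):
    return np.array([[np.exp(m[1]), m[0] * np.exp(m[1])],
                     [1.0, 2.0 * m[1]]])

d_obs = g(np.array([1.5, 0.5]))     # data generated from a known "true" model
m, lam = np.array([1.0, 0.0]), 1.0  # initial guess and damping parameter

for _ in range(50):
    r = g(m) - d_obs
    J = jac(m)
    # Damped (regularized) Gauss-Newton step; lam plays the regularization role
    step = np.linalg.solve(J.T @ J + lam * np.eye(2), -J.T @ r)
    if np.sum((g(m + step) - d_obs) ** 2) < np.sum(r ** 2):
        m, lam = m + step, lam * 0.5   # accept step, relax the damping
    else:
        lam *= 2.0                     # reject step, increase the damping

print(np.round(m, 3))
```

The accept/reject control of lam is the standard Levenberg-Marquardt trust mechanism; in the ensemble setting, gradually reducing the damping over iterations gives the stepwise updates of the iES schemes compared in the paper.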


Geophysics ◽  
2012 ◽  
Vol 77 (1) ◽  
pp. M1-M16 ◽  
Author(s):  
Juan Luis Fernández Martínez ◽  
Tapan Mukerji ◽  
Esperanza García Gonzalo ◽  
Amit Suman

History matching provides reservoir engineers with an improved spatial distribution of physical properties to be used in forecasting the reservoir response for field management. The ill-posed character of the history-matching problem yields nonuniqueness and numerical instabilities that increase with reservoir complexity. These features might cause local optimization methods to produce unpredictable results, unable to discriminate among the multiple models that fit the observed data (production history). Also, the high dimensionality of the inverse problem impedes estimation of uncertainties using classical Markov-chain Monte Carlo methods. We attenuated the ill-conditioned character of this history-matching inverse problem by reducing the model complexity using a spatial principal-component basis and by combining flow-production measurements and time-lapse seismic crosswell tomographic images as observables. Additionally, the inverse problem was solved in a stochastic framework. For this purpose, we used a family of particle-swarm-optimization (PSO) optimizers that have been deduced from a physical analogy of the swarm system. For a synthetic sand-and-shale reservoir, we analyzed the performance of the different PSO optimizers, in terms of both exploration and convergence rate, for two reservoir models of different complexity and in the presence of different levels of white Gaussian noise added to the synthetic observed data. We demonstrated that PSO optimizers have a very good convergence rate for this example and, in addition, provide approximate measures of uncertainty around the optimum facies model. The PSO algorithms are robust in the presence of noise, which is always the case for real data.
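A minimal global-best PSO conveys the mechanics used above: each particle moves under inertia plus attraction to its personal best and the swarm best, and the final spread of personal bests gives a rough uncertainty measure around the optimum. This sketch uses a toy quadratic misfit in a two-dimensional reduced parameter space; it is not the specific PSO family derived in the paper, and all coefficients are standard illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)

def misfit(x):                     # toy history-match objective (illustrative)
    return np.sum((x - np.array([1.0, -2.0])) ** 2, axis=1)

n, dim = 30, 2                     # swarm size, reduced-space dimension
w, c1, c2 = 0.7, 1.5, 1.5          # inertia and acceleration coefficients
x = rng.uniform(-5, 5, (n, dim))   # particle positions
v = np.zeros((n, dim))             # particle velocities
pbest, pbest_f = x.copy(), misfit(x)
gbest = pbest[np.argmin(pbest_f)].copy()

for _ in range(200):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    f = misfit(x)
    better = f < pbest_f           # update personal bests where improved
    pbest[better], pbest_f[better] = x[better], f[better]
    gbest = pbest[np.argmin(pbest_f)].copy()

print(np.round(gbest, 2))
```

In a real application the misfit would be a simulator run on principal-component coefficients, which is why the dimensionality reduction described above is essential to keep the swarm search tractable.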

