Uncertainty Space Expansion: A Consistent Integration of Measurement Errors in History Matching

SPE Journal ◽  
2020 ◽  
Vol 25 (06) ◽  
pp. 3317-3331
Author(s):  
Pipat Likanapaisal ◽  
Hamdi A. Tchelepi

Summary In general, a probabilistic framework for a modeling process involves two uncertainty spaces: model parameters and state variables (or predictions). The two uncertainty spaces in reservoir simulation are connected by the governing equations of flow and transport in porous media in the form of a reservoir simulator. In a forward problem (or a predictive run), the reservoir simulator directly maps the uncertainty space of the model parameters to the uncertainty space of the state variables. Conversely, an inverse problem (or history matching) aims to improve the descriptions of the model parameters by using the measurements of state variables. However, we cannot solve the inverse problem directly in practice. Numerous algorithms, including Kriging-based inversion and the ensemble Kalman filter (EnKF) and its many variants, simplify the system by using a linear assumption. The purpose of this paper is to improve the integration of measurement errors in history-matching algorithms that rely on the linear assumption. The statistical moment equation (SME) approach with the Kriging-based inversion algorithm is used to illustrate several practical examples. In the Motivation section, an example of pressure conditioning involves a measurement that contains no additional information because of its significant measurement error. This example highlights the inadequacy of the current method, which underestimates the conditional uncertainty for both model parameters and predictions. Accordingly, we derive a new formula that recognizes the absence of additional information and preserves the unconditional uncertainty. We believe this to be the consistent behavior for integrating measurement errors. Other examples validate the new formula with both linear and nonlinear (i.e., the saturation equation) problems, with single and multiple measurements, and with different configurations of measurement errors. For broader applications, we also develop an equivalent formula for algorithms in the Monte Carlo simulation (MCS) approach, such as EnKF and the ensemble smoother (ES).
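
To make the limiting behavior concrete, here is a minimal sketch of a scalar linear (Kriging/Kalman-type) conditioning step, showing how the measurement-error variance R enters the gain; as R grows, the posterior variance should revert to the prior, which is the consistent behavior the summary argues for. The symbols and values are generic illustrations, not the authors' SME formulation.

```python
import numpy as np

# Scalar linear conditioning: d = h * m + noise, noise ~ N(0, R).
# Prior: m ~ N(m0, C). Kalman/Kriging-type update:
#   K = C h / (h^2 C + R)
#   m_post = m0 + K (d_obs - h m0)
#   C_post = (1 - K h) C
def condition(m0, C, h, d_obs, R):
    K = C * h / (h**2 * C + R)        # gain shrinks as R grows
    m_post = m0 + K * (d_obs - h * m0)
    C_post = (1.0 - K * h) * C
    return m_post, C_post

m0, C, h, d_obs = 1.0, 0.5, 2.0, 1.8
for R in [0.01, 1.0, 1e6]:            # increasingly noisy measurement
    m_post, C_post = condition(m0, C, h, d_obs, R)
    print(f"R={R:>9}: mean={m_post:.3f}, var={C_post:.3f}")
# As R -> infinity the posterior variance returns to the prior C = 0.5:
# an uninformative measurement should leave the uncertainty unchanged.
```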


Author(s):  
Geir Evensen

Abstract It is common to formulate the history-matching problem using Bayes’ theorem. From Bayes’ theorem, the conditional probability density function (pdf) of the uncertain model parameters is proportional to the prior pdf of the model parameters, multiplied by the likelihood of the measurements. The static model parameters are random variables characterizing the reservoir model, while the observations include, e.g., historical rates of oil, gas, and water produced from the wells. The reservoir prediction model is assumed perfect, and there are no errors besides those in the static parameters. However, this formulation is flawed. The historical rate data only approximately represent the real production of the reservoir and contain errors. History-matching methods usually take these errors into account in the conditioning but neglect them when forcing the simulation model by the observed rates during the historical integration. Thus, the model prediction depends on some of the same data used in the conditioning. The paper presents a formulation of Bayes’ theorem that considers the data dependency of the simulation model. In the new formulation, one must update both the poorly known model parameters and the rate-data errors. The result is an improved posterior ensemble of prediction models that better cover the observations with more substantial and realistic uncertainty. The implementation accounts correctly for correlated measurement errors and demonstrates the critical role of these correlations in reducing the update’s magnitude. The paper also shows the consistency of the subspace inversion scheme by Evensen (Ocean Dyn. 54, 539–560, 2004) in the case with correlated measurement errors and demonstrates its accuracy when using a “larger” ensemble of perturbations to represent the measurement error covariance matrix.
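
As a rough sketch of the ensemble mechanics involved (not Evensen's subspace inversion scheme itself), the following shows a plain ensemble-smoother update in which the observation perturbations are drawn from a full, correlated measurement-error covariance, so the correlations directly damp the update's magnitude. All names and values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def es_update(M, D, d_obs, Cd):
    """Plain ensemble-smoother update with perturbed observations.
    M: (n_param, n_ens) parameter ensemble
    D: (n_obs, n_ens) predicted-data ensemble
    d_obs: (n_obs,) observed data
    Cd: (n_obs, n_obs) measurement-error covariance (may be correlated)
    """
    n_obs, n_ens = D.shape
    # Perturb observations with correlated noise ~ N(0, Cd)
    E = rng.multivariate_normal(np.zeros(n_obs), Cd, size=n_ens).T
    D_pert = d_obs[:, None] + E
    Ma = M - M.mean(axis=1, keepdims=True)
    Da = D - D.mean(axis=1, keepdims=True)
    Cmd = Ma @ Da.T / (n_ens - 1)      # cross-covariance
    Cdd = Da @ Da.T / (n_ens - 1)      # predicted-data covariance
    S = Cdd + Cd                       # symmetric innovation covariance
    K = np.linalg.solve(S, Cmd.T).T    # K = Cmd @ S^{-1}
    return M + K @ (D_pert - D)

# Toy linear problem: 3 parameters, 2 observations, correlated errors
n_par, n_ens = 3, 200
M = rng.normal(size=(n_par, n_ens))
G = rng.normal(size=(2, n_par))        # linear "simulator"
D = G @ M
Cd = np.array([[0.5, 0.3], [0.3, 0.5]])  # off-diagonals: correlation
M_post = es_update(M, D, np.array([1.0, -0.5]), Cd)
print(M_post.mean(axis=1), M_post.var(axis=1))
```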


Mathematics ◽  
2021 ◽  
Vol 9 (24) ◽  
pp. 3289
Author(s):  
Emil N. Musakaev ◽  
Sergey P. Rodionov ◽  
Nail G. Musakaev

A three-dimensional numerical hydrodynamic model describes the processes of developing oil and gas fields fairly accurately, but it has good predictive properties only if high-quality input data and comprehensive information about the reservoir are available. However, under conditions of high input-data uncertainty, measurement errors, and significant time and resource costs for processing and analyzing large amounts of data, the use of such models may be unreasonable and can lead to ill-posed problems, in which either the uniqueness of the solution or its stability is violated. A well-known method for dealing with these problems is regularization, that is, the addition of a priori information. In contrast to full-scale modeling, reduced-physics models are currently under active development; they are used primarily when an operational decision must be made and computational resources are limited. One of the most popular simplified models is the material balance model, which makes it possible to directly capture the relationship between reservoir pressure, flow rates, and the integral reservoir characteristics. In this paper, we propose a hierarchical approach to the problem of oil-field waterflooding control using material balance models in successive approximations: first for the field as a whole, then for hydrodynamically connected blocks of the field, and then for individual wells. When moving from one level of model detail to the next, the modeling results from the previous levels of the hierarchy serve as additional regularizing information, which ultimately makes it possible to solve the history-matching problem (identification of the filtration model) correctly under incomplete input information.
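
As a hedged illustration of the top level of the proposed hierarchy, a single-tank material balance directly links reservoir pressure to rates and integral reservoir characteristics. The compressibility form and parameter values below are generic assumptions, not the authors' model.

```python
import numpy as np

def tank_pressure(p_init, q_inj, q_prod, c_t, V_p, dt):
    """Single-tank material balance: reservoir pressure responds to the
    net volumetric balance through total compressibility:
        dp/dt = (q_inj - q_prod) / (c_t * V_p)
    p_init : initial reservoir pressure
    q_inj, q_prod : arrays of injection/production rates per timestep
    c_t : total compressibility, V_p : pore volume, dt : timestep length
    """
    dp = (np.asarray(q_inj) - np.asarray(q_prod)) / (c_t * V_p) * dt
    return p_init + np.cumsum(dp)

# Toy example: constant production, later supported by injection
q_prod = np.full(100, 500.0)                      # rb/day
q_inj = np.where(np.arange(100) > 50, 450.0, 0.0)
p = tank_pressure(p_init=3000.0, q_inj=q_inj, q_prod=q_prod,
                  c_t=1e-5, V_p=5e7, dt=1.0)
print(p[0], p[-1])   # pressure declines, then the decline flattens
```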


2019 ◽  
Vol 141 (7) ◽  
Author(s):  
Sungil Kim ◽  
Hyungsik Jung ◽  
Jonggeun Choe

Reservoir characterization is the process of building dependable reservoir models from available reservoir information. There are promising ensemble-based methods such as the ensemble Kalman filter (EnKF), ensemble smoother (ES), and ensemble smoother with multiple data assimilation (ES-MDA). ES-MDA is an iterative version of ES with an inflated covariance matrix of measurement errors. It provides efficient and consistent global updates compared with EnKF and ES. However, ensemble-based methods might not work properly for channel reservoirs because their parameters are highly non-Gaussian. Thus, various parameterization methods have been suggested in previous studies to handle nonlinear and non-Gaussian parameters. The discrete cosine transform (DCT) can extract essential channel information, whereas the level set method (LSM) excels at detailed, grid-scale analysis of channel borders while transforming parameters toward Gaussianity. However, DCT and LSM have weaknesses when applied separately to channel reservoirs. Therefore, we propose a properly designed combination algorithm using DCT and LSM in ES-MDA. When DCT and LSM agree on the facies update results, a gridblock naturally receives the relevant facies; if not, the facies is assigned according to the average facies probability map from DCT and LSM. In this way, the two methods work in a complementary manner, preventing wrong or biased facies decisions. Consequently, the proposed method presents not only stable channel properties, such as connectivity and continuity, but also patterns similar to the true model. It also gives trustworthy future predictions of gas and water production, owing to a facies distribution that is well matched to the reference.
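
The ES-MDA ingredient mentioned above can be sketched compactly: the measurement-error covariance is inflated at each assimilation step by a coefficient α_i, with the reciprocals of the α_i summing to one. The sketch below uses the common constant choice α_i = N_a and a toy linear forward model; the DCT/LSM facies logic is omitted.

```python
import numpy as np

rng = np.random.default_rng(1)

def es_mda(M, forward, d_obs, Cd, n_assim=4):
    """ES-MDA: repeat an ES update n_assim times, each time inflating
    the measurement-error covariance by alpha_i with sum(1/alpha_i) = 1.
    Here alpha_i = n_assim (the common constant choice)."""
    alphas = [n_assim] * n_assim          # 1/4 + 1/4 + 1/4 + 1/4 = 1
    n_obs, n_ens = len(d_obs), M.shape[1]
    for a in alphas:
        D = forward(M)                    # run the ensemble forward
        E = rng.multivariate_normal(np.zeros(n_obs), a * Cd, size=n_ens).T
        Ma = M - M.mean(axis=1, keepdims=True)
        Da = D - D.mean(axis=1, keepdims=True)
        Cmd = Ma @ Da.T / (n_ens - 1)
        Cdd = Da @ Da.T / (n_ens - 1)
        K = np.linalg.solve(Cdd + a * Cd, Cmd.T).T   # inflated gain
        M = M + K @ (d_obs[:, None] + E - D)
    return M

# Toy linear forward model standing in for the reservoir simulator
G = np.array([[1.0, 0.5], [0.2, 1.5]])
forward = lambda M: G @ M
M0 = rng.normal(size=(2, 100))
M_post = es_mda(M0, forward, d_obs=np.array([1.0, 0.3]),
                Cd=0.1 * np.eye(2))
print(M_post.mean(axis=1))
```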


2021 ◽  
Author(s):  
Usman Aslam ◽  
Jorge Burgos ◽  
Craig Williams ◽  
Shawn McCloskey ◽  
James Cooper ◽  
...  

Abstract Reservoir production forecasts are inherently uncertain due to the lack of quality data available to build predictive reservoir models. Multiple data types, including historical production, well tests (RFT/PLT), and time-lapse seismic data, are assimilated into reservoir models during the history matching process to improve predictability of the model. Traditionally, a ‘best estimate’ for relative permeability data is assumed during the history matching process, despite there being significant uncertainty in the relative permeability. Relative permeability governs multiphase flow in the reservoir; therefore, it has significant importance in understanding the reservoir behavior as well as for model calibration, and hence for reliable production forecasts. Performing sensitivities around the ‘best estimate’ relative permeability case will cover only part of the uncertainty space, with no indication of the confidence that may be placed on these forecasts. In this paper, we present an application of a Bayesian framework for uncertainty assessment and efficient history matching of a Permian CO2 EOR field for reliable production forecasts. The study field has complex geology with over 65 years of historical data from primary recovery, waterflood, and CO2 injection. Relative permeability data from the field showed significant uncertainty, so we used uncertainties in the saturation endpoints as well as in the curvature of the relative permeability in multiple zones, by employing generalized Corey functions for relative permeability parameterization. Uncertainty in the relative permeability is incorporated through a common platform integrator. An automated workflow generates the first set of relative permeability curves sampled from the prior distribution of saturation endpoints and Corey exponents, called ‘scoping runs’. These relative permeability curves are then passed to the reservoir simulator. The assumptions of uncertainties in the relative permeability data and other dynamic parameters are quickly validated by comparing the scoping runs and historical observations. By creating a mismatch or likelihood function, the Bayesian framework generates an ensemble of history-matched models calibrated to the production data, which can then be used for reliable probabilistic forecasting. Several iterations during the manual history match did not yield an acceptable solution, as uncertainty in the relative permeability was ignored. An application of Bayesian inference accelerated by a proxy model found the relative permeability data to be one of the most influential parameters during the assisted history matching exercise. Incorporating the uncertainty in relative permeability data along with other dynamic parameters not only helped speed up the model calibration process, but also led to the identification of multiple history-matched models. In addition, results show that the use of the Bayesian framework significantly reduced uncertainty in the most important dynamic parameters. The proposed approach allows incorporating previously ignored uncertainty in the relative permeability data in a systematic manner. The user-defined mismatch function increases the likelihood of obtaining an acceptable match, and its weights account for both the measurement uncertainty and the effect of simulation-model inaccuracies. The Bayesian framework considers the whole uncertainty space and not just the history-match region, leading to the identification of multiple history-matched models.
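
A minimal sketch of the generalized Corey parameterization named above, with saturation endpoints and curvature exponents as the uncertain parameters. The prior ranges, function names, and values are illustrative assumptions, not the study's actual inputs.

```python
import numpy as np

def corey_relperm(Sw, Swc, Sor, krw_end, kro_end, nw, no):
    """Generalized Corey curves: normalized water saturation raised to
    per-phase exponents, scaled by endpoint permeabilities.
    Uncertain parameters of the kind sampled in the study: Swc, Sor
    (saturation endpoints) and nw, no (curvature exponents)."""
    Swn = np.clip((Sw - Swc) / (1.0 - Swc - Sor), 0.0, 1.0)
    krw = krw_end * Swn**nw           # water relative permeability
    kro = kro_end * (1.0 - Swn)**no   # oil relative permeability
    return krw, kro

# One 'scoping run' sample drawn from assumed prior ranges
rng = np.random.default_rng(2)
params = dict(Swc=rng.uniform(0.1, 0.25), Sor=rng.uniform(0.15, 0.35),
              krw_end=rng.uniform(0.2, 0.6), kro_end=rng.uniform(0.7, 1.0),
              nw=rng.uniform(1.5, 4.0), no=rng.uniform(1.5, 4.0))
Sw = np.linspace(0.0, 1.0, 11)
krw, kro = corey_relperm(Sw, **params)
print(np.round(krw, 3), np.round(kro, 3))
```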


1977 ◽  
Vol 17 (01) ◽  
pp. 42-56 ◽  
Author(s):  
A.H. Dogru ◽  
T.N. Dixon ◽  
T.F. Edgar

Abstract Methods of nonlinear regression theory were applied to the reservoir history-matching problem to determine the effect of erroneous parameter estimates obtained from well testing on the future prediction of reservoir pressures. Two examples were studied: well testing in a radial one-dimensional slightly compressible reservoir and in an undersaturated, two-dimensional, heterogeneous oil field. The reservoir parameters of permeability, porosity, external radius, and pore volume were considered, and the effects of measurement error, test time, and flow rate on the confidence limits were computed. Introduction The operation of a reservoir simulator requires accurate estimates of the reservoir properties. However, the simulation parameters, such as permeability, porosity, and reservoir geometry, are usually unknown unless coring and physical property analysis have been undertaken. Because of the cost of these procedures, it is more desirable to use the pressures measured at the well during a well test and indirectly compute the important parameters of the system. By using history matching of the test data to obtain the system parameters, the future pressure behavior of the reservoir can be predicted. Several studies on history matching have indicated that the well-test approach for determining the reservoir parameters often suffers from incorrect and nonunique parameter estimates. The factors that affect the parameter estimation can be classified as model errors, observability, measurement errors or noise, history time, test procedure, and optimization procedure. Model errors arise from the inaccuracy of the model and the numerical integration. For example, a reservoir simulator is only a reasonable approximation for flow through porous media. Solution of a model equation by numerical means also introduces roundoff and discretization errors. Observability of the system plays an important role in estimating the reservoir parameters. Depending on the location of the well and the number of data points, it may not be possible to determine uniquely all reservoir parameters from the measurements made at that well. Observability is strictly a function of the reservoir model used. At a given well, pressure measurements may only reflect the values of the parameters in specific zones of the reservoir. If a specific zone away from the well does not affect the measured pressure, then the system is not observable at that particular location. A rigorous definition of observability can be found in other papers. Measurement errors in the pressures and flow rates are another source of unrealistic parameter estimates. Longer history times always give more information about the reservoir as long as the system remains in a dynamic state. The nature of the system input (well flow rate) also affects the accuracy of the estimates and predictions. The final source of incorrect parameter estimates arises because the history-matching problem, posed mathematically, is usually a nonlinear programming problem that must be solved computationally. Such a problem yields multiple extrema that often can lead to a relative minimum (rather than a global minimum) in the numerical search for the smallest matching error. Also, the magnitude of the objective function can be quite insensitive to the parameters selected, thus causing the optimization procedure to terminate prematurely. The above factors control the history-matching process; with actual data, it is usually impossible to identify the exact contributions of each factor to the errors in the parameter estimates. Since a certain amount of error will be introduced into the estimated parameters from the history-matching process, it is useful to study the magnitude of this error resulting from various sources under controlled simulation conditions. Also, it is important to determine how the errors in the parameters are reflected in the future predictions of the pressures.
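
The confidence-limit machinery the paper applies can be sketched with the standard linearized result from nonlinear regression theory: the parameter covariance is approximated by σ²(JᵀJ)⁻¹, where J is the sensitivity (Jacobian) of the simulated pressures to the parameters and σ² is the measurement-error variance. The finite-difference Jacobian and the toy drawdown model below are illustrative stand-ins for the reservoir simulator.

```python
import numpy as np

def param_covariance(model, theta, t, sigma2, eps=1e-6):
    """Linearized parameter covariance for nonlinear least squares:
    Cov(theta) ~= sigma2 * (J^T J)^{-1}, with J the Jacobian of the
    model response w.r.t. the parameters (finite differences here)."""
    r0 = model(theta, t)
    J = np.empty((len(t), len(theta)))
    for j in range(len(theta)):
        th = theta.copy()
        th[j] += eps
        J[:, j] = (model(th, t) - r0) / eps
    return sigma2 * np.linalg.inv(J.T @ J)

# Toy drawdown model standing in for the reservoir simulator:
# p(t) = p_i - a * log(t + 1), parameters theta = [p_i, a]
model = lambda th, t: th[0] - th[1] * np.log(t + 1.0)
t = np.linspace(1.0, 100.0, 50)
cov = param_covariance(model, np.array([3000.0, 50.0]), t, sigma2=4.0)
ci = 1.96 * np.sqrt(np.diag(cov))   # ~95% confidence half-widths
print(ci)   # larger measurement error sigma2 widens the limits
```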


SPE Journal ◽  
2012 ◽  
Vol 18 (01) ◽  
pp. 159-171 ◽  
Author(s):  
Mario Trani ◽  
Rob Arts ◽  
Olwijn Leeuwenburgh

Summary Time-lapse seismic data provide information on the dynamics of multiphase reservoir fluid flow in places where no production data from wells are available. This information, in principle, could be used to estimate unknown reservoir properties. However, the amount, resolution, and character of the data have long posed significant challenges for quantitative use in assisted-history-matching workflows. Previous studies, therefore, have generally investigated methods for updating single models with a reduced parameter-uncertainty space. Recent developments in ensemble-based history-matching methods have shown the feasibility of multimodel history matching of production data while maintaining a full uncertainty description. Here, we introduce a robust and flexible reparameterization for interpreted fluid fronts or seismic attribute isolines that extends these developments to seismic history matching. The seismic data set is reparameterized, in terms of arrival times, at observed front positions, thereby significantly reducing the number of data while retaining essential information. A simple 1D example is used to introduce the concepts of the approach. A synthetic 3D example, with spatial complexity that is typical for many waterfloods, is examined in detail. History-matching cases based on both separate and combined use of production and seismic data are examined. It is shown that consistent multimodel history matches can be obtained without the need for reduction of the parameter space or for localization of the impact of observations. The quality of forecasts based on the history-matched models is evaluated by simulating both expected production and saturation changes throughout the field for a fixed operating strategy. It is shown that bias and uncertainty in the forecasts of production, both at existing wells and in the flooded area, are reduced considerably when both production and seismic data are incorporated. The proposed workflow, therefore, enables better decisions on field developments that require optimal placement of infill wells.
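
A hedged sketch of the reparameterization idea: a stack of saturation (or attribute) snapshots is collapsed into a single arrival-time field, one value per cell, which greatly reduces the data volume while retaining the front geometry. The threshold and the toy 1D waterflood are illustrative assumptions, not the authors' workflow.

```python
import numpy as np

def front_arrival_times(maps, times, threshold=0.5):
    """Reparameterize a stack of saturation/attribute snapshots into one
    arrival-time field: for each cell, the first survey time at which
    the value crosses the front threshold (NaN if it never arrives)."""
    maps = np.asarray(maps)                 # (n_surveys, n_cells)
    arrived = maps >= threshold
    first = np.argmax(arrived, axis=0)      # index of first True per cell
    never = ~arrived.any(axis=0)
    t_arr = np.asarray(times, dtype=float)[first]
    t_arr[never] = np.nan
    return t_arr

# Toy 1D waterflood: front advancing two cells per survey
x = np.arange(10)
times = [1.0, 2.0, 3.0]
maps = [(x < 2 * (i + 1)).astype(float) for i in range(3)]
print(front_arrival_times(maps, times))
# [1. 1. 2. 2. 3. 3. nan nan nan nan] -- 3 maps collapse to 1 data vector
```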


Author(s):  
W.J. de Ruijter ◽  
Renu Sharma

Established methods for measurement of lattice spacings and angles of crystalline materials include x-ray diffraction, microdiffraction and HREM imaging. Structural information from HREM images is normally obtained off-line with the traveling table microscope or by the optical diffractogram technique. We present a new method for precise measurement of lattice vectors from HREM images using an on-line computer connected to the electron microscope. It has already been established that an image of crystalline material can be represented by a finite number of sinusoids. The amplitude and the phase of these sinusoids are affected by the microscope transfer characteristics, which are strongly influenced by the settings of defocus, astigmatism and beam alignment. However, the frequency of each sinusoid is solely a function of overall magnification and periodicities present in the specimen. After proper calibration of the overall magnification, lattice vectors can be measured unambiguously from HREM images. Measurement of lattice vectors is a statistical parameter estimation problem which is similar to amplitude, phase and frequency estimation of sinusoids in 1-dimensional signals as encountered, for example, in radar, sonar and telecommunications. It is important to properly model the observations, the systematic errors and the non-systematic errors. The observations are modelled as a sum of (2-dimensional) sinusoids. In the present study the components of the frequency vector of the sinusoids are the only parameters of interest. Non-systematic errors in recorded electron images are described as white Gaussian noise. The most important systematic error is geometric distortion. Lattice vectors are measured using a two-step procedure. First, a coarse search is obtained using a Fast Fourier Transform on an image section of interest. Prior to Fourier transformation the image section is multiplied with a window, which gradually falls off to zero at the edges. The user indicates interactively the periodicities of interest by selecting spots in the digital diffractogram. A fine search for each selected frequency is implemented using a bilinear interpolation, which is dependent on the window function. It is possible to refine the estimation even further using non-linear least-squares estimation. The first two steps provide the proper starting values for the numerical minimization (e.g. Gauss-Newton). This third step increases the precision by 30%, to the highest theoretically attainable (the Cramér-Rao lower bound). In the present study we use a Gatan 622 TV camera attached to the JEM 4000EX electron microscope. Image analysis is implemented on a Micro VAX II computer equipped with a powerful array processor and real-time image processing hardware. The typical precision, as defined by the standard deviation of the distribution of measurement errors, is found to be <0.003Å measured on single crystal silicon and <0.02Å measured on small (10-30Å) specimen areas. These values are about 10 times larger than predicted by theory. Furthermore, the measured precision is observed to be independent of the signal-to-noise ratio (determined by the number of averaged TV frames). Evidently, the precision is restricted by geometric distortion, mainly caused by the TV camera. For this reason, we are replacing the Gatan 622 TV camera with a modern high-grade CCD-based camera system. Such a system not only has negligible geometric distortion, but also high dynamic range (>10,000) and high resolution (1024x1024 pixels). The geometric distortion of the projector lenses can be measured and corrected through resampling of the digitized image.
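
The two-step frequency estimation described above can be sketched in one dimension: window the data, locate the FFT peak (coarse search), then refine around the peak. In the sketch below, a simple quadratic interpolation of the peak's neighbors stands in for the paper's window-dependent bilinear fine search and Gauss-Newton refinement; names and values are illustrative.

```python
import numpy as np

def estimate_frequency(signal, dt=1.0):
    """Two-step frequency estimation:
    (1) window the data and locate the FFT peak (coarse search);
    (2) refine with quadratic interpolation of the peak's neighbors."""
    n = len(signal)
    w = np.hanning(n)                    # window tapering to zero at edges
    spec = np.abs(np.fft.rfft(signal * w))
    k = int(np.argmax(spec[1:]) + 1)     # coarse bin, skipping DC
    # Quadratic fit through the peak bin and its two neighbors
    a, b, c = spec[k - 1], spec[k], spec[k + 1]
    delta = 0.5 * (a - c) / (a - 2 * b + c)
    return (k + delta) / (n * dt)        # refined frequency estimate

# 1D analog of a lattice fringe: known spacing plus white Gaussian noise
rng = np.random.default_rng(3)
n, f_true = 256, 0.1234
x = np.sin(2 * np.pi * f_true * np.arange(n)) + 0.1 * rng.normal(size=n)
print(estimate_frequency(x))   # close to 0.1234
```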

