Regeneration of Initial Ensembles With Facies Analysis for Efficient History Matching

2017 ◽  
Vol 139 (4) ◽  
Author(s):  
Byeongcheol Kang ◽  
Jonggeun Choe

Reservoir characterization is needed to estimate reservoir properties and forecast production rates reliably. However, it is challenging to determine the reservoir properties of interest from limited information. Therefore, well-designed reservoir models, which reflect the characteristics of the true field, should be selected and fine-tuned. We propose a novel scheme for generating initial reservoir models from the available static and production history data. We select representative reservoir models by projecting the models onto a two-dimensional (2D) plane using principal component analysis (PCA) and calculating the errors of their production rates against observed data. These selected models, which have geological properties similar to the reference, are used to regenerate models by perturbing along the boundaries between facies. The regenerated models all have different facies distributions but share the principal characteristics of the selected models. We compare cases using 400 ensemble members, 100 models obtained by unbiased uniform sampling, and 100 models regenerated by the proposed method. We analyze two synthetic reservoirs with different permeability distributions: one is a typical heterogeneous reservoir and the other is a channel reservoir with a bimodal permeability distribution. Compared with the case using all 400 models with the ensemble Kalman filter (EnKF), the simulation time is dramatically reduced to 4.7%, while the prediction quality for oil and water production is improved. Even in the more complex reservoir case, the proposed method shows clear improvements, with reduced uncertainties relative to the other cases.
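The selection step described above (PCA projection plus a production-error ranking) can be sketched in a few lines. This is a minimal, hypothetical illustration, not the authors' code: the function and variable names are my own, and PCA is done via SVD so no extra libraries are needed.

```python
import numpy as np

def select_representative_models(models, sim_rates, obs_rates, n_select=10):
    """Project reservoir models onto a 2D plane with PCA and keep the
    members whose simulated production best matches the observations.

    models    : (n_ens, n_cells) flattened property fields (e.g. log-perm)
    sim_rates : (n_ens, n_time) simulated production rates per member
    obs_rates : (n_time,) observed production history
    """
    # PCA via SVD on the mean-centred ensemble (no sklearn dependency)
    X = models - models.mean(axis=0)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    coords_2d = X @ Vt[:2].T          # 2D projection of every member

    # Production-data misfit of each member against the observed history
    errors = np.linalg.norm(sim_rates - obs_rates, axis=1)

    # Keep the n_select members with the smallest misfit
    idx = np.argsort(errors)[:n_select]
    return idx, coords_2d
```

The selected indices would then feed the facies-boundary perturbation step to regenerate a small, well-conditioned ensemble.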

Geophysics ◽  
1995 ◽  
Vol 60 (2) ◽  
pp. 354-364 ◽  
Author(s):  
Larry Lines ◽  
Henry Tan ◽  
Sven Treitel ◽  
John Beck ◽  
Richard Chambers ◽  
...  

In 1992, there was a collaborative effort in reservoir geophysics involving Amoco, Conoco, Schlumberger, and Stanford University to delineate variations in reservoir properties of the Grayburg unit in a West Texas CO2 pilot at North Cowden Field. Our objective was to go beyond traveltime tomography in characterizing reservoir heterogeneity and flow anisotropy. The effort involved a comprehensive set of measurements: traveltime tomography, reflector imaging, channel-wave analysis for reservoir continuity, shear-wave splitting studies for borehole stress-pattern estimation, and seismic anisotropy analysis. All of these studies were combined with 3-D surface seismic data and with sonic log interpretation. The results are to be validated in the future against cores and engineering data by history matching of primary, water, and CO2 injection performance. The implementation of these procedures should provide critical information on reservoir heterogeneities and preferential flow directions. Geophysical methods generally indicated a continuous reservoir zone between wells.


2015 ◽  
Vol 138 (1) ◽  
Author(s):  
Jihoon Park ◽  
Jeongwoo Jin ◽  
Jonggeun Choe

For decision making, proper reservoir characterization and uncertainty assessment of reservoir performance are crucial. Since initial models constructed from limited data carry high uncertainty, both static and dynamic data must be integrated for reliable future predictions. Uncertainty quantification is computationally demanding because a single history match requires many iterative forward simulations and optimizations, and multiple realizations of the reservoir model must be computed. In this paper, a methodology is proposed to rapidly quantify uncertainty by combining streamline-based inversion with distance-based clustering. The distance between each pair of reservoir models is defined as the norm of the difference of their generalized travel time (GTT) vectors. The reservoir models are then grouped according to these distances, and representative models are selected from each group. Inversions are performed on the representative models instead of on all models. We use generalized travel time inversion (GTTI) to integrate dynamic data, overcoming high nonlinearity while taking advantage of its computational efficiency. It is verified that the proposed method gathers models with similar dynamic responses and permeability distributions. It also assesses the uncertainty of reservoir performance reliably, while significantly reducing computation by using the representative models.
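The distance-based clustering described above can be sketched as follows. This is an illustrative medoid-based grouping under assumed inputs (precomputed GTT vectors per member), not the authors' exact algorithm:

```python
import numpy as np

def cluster_by_gtt(gtt, n_groups=3):
    """Group ensemble members by the norm of GTT-vector differences and
    return one representative (medoid) per group.

    gtt : (n_ens, n_wells) generalized-travel-time vector per member
    """
    # Pairwise distances d_ij = ||GTT_i - GTT_j||
    d = np.linalg.norm(gtt[:, None, :] - gtt[None, :, :], axis=2)

    # Farthest-point seeding of the group medoids
    medoids = [0]
    for _ in range(n_groups - 1):
        medoids.append(int(np.argmax(d[medoids].min(axis=0))))

    # Assign each member to its nearest medoid
    labels = np.argmin(d[medoids], axis=0)

    # Representative of each group = member minimizing total in-group distance
    reps = []
    for g in range(n_groups):
        members = np.where(labels == g)[0]
        reps.append(int(members[np.argmin(d[np.ix_(members, members)].sum(axis=1))]))
    return np.array(reps), labels
```

Inversion (GTTI) would then be run only on the returned representatives, one per group.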


2021 ◽  
Author(s):  
Thomas J. Hampton ◽  
Mohamed El-Mandouh ◽  
Stevan Weber ◽  
Tirth Thaker ◽  
K. Patel ◽  
...  

Abstract Mathematical models are needed to help define, analyze, and quantify solutions for designing and managing steam floods. This paper discusses two main modeling methods: analytical models and numerical simulation. Deciding which method to use, and when, requires an understanding of the assumptions, strengths, and limitations of each. This paper presents their advantages and disadvantages by comparing analytical models with simulation as the reservoir characterization becomes progressively more complex (dip, layering, heterogeneity between injector and producer, and reservoir thickness). While there are many analytical models, three are used in this paper: Marx & Langenheim, Modified Neuman, and Jeff Jones. The simulator used was CMG STARS on a single pattern, for both 5-spot and 9-spot patterns, and on Case 6 of 9 patterns, 5-spot. Results were obtained for six cases of varying reservoir properties using the Marx & Langenheim, Modified Neuman, and Jeff Jones models. Simulation was also run on each of the six cases, first with Modified Neuman steam rates and then with Jeff Jones steam rates, using 9-spot and 5-spot patterns. This was done on a predictive basis from the inputs provided, without adjusting or history matching to analog or historical performance. An optimization run using particle swarm optimization was applied to one case to minimize the steam-oil ratio (SOR) and maximize net present value (NPV). The conclusion from comparing the cases is that simulation is needed for complex geology, heterogeneity, and changes in layering. Simulation can also be used to maximize economics with an AI-based optimization tool. Within their limitations, the analytical models are good for quick looks such as screening, scoping design, some surveillance, and conceptual understanding of a basic steam flood with uniform geologic properties.
This paper is innovative in its comparison of analytical models with simulation modeling. Results quantifying the differences in oil rate, SOR, and injection rates (Neuman and Jeff Jones) and their impact on recovery factors are presented.
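Of the three analytical models named above, Marx & Langenheim has a compact closed form for the heated-area growth under constant heat injection. The sketch below follows the standard textbook formulation; all parameter names and the values in the test are illustrative, not taken from the paper.

```python
import math

def marx_langenheim_area(t, q_inj, h, dT, M_res, M_ob, k_ob):
    """Marx & Langenheim heated-area growth for a constant-rate steam flood.

    t     : time since injection start [s]
    q_inj : constant heat-injection rate [W]
    h     : net reservoir thickness [m]
    dT    : steam-zone temperature rise above initial reservoir temperature [K]
    M_res : volumetric heat capacity of the reservoir [J/m^3/K]
    M_ob  : volumetric heat capacity of over/underburden [J/m^3/K]
    k_ob  : thermal conductivity of over/underburden [W/m/K]
    """
    # Dimensionless time
    tD = 4.0 * k_ob * M_ob * t / (M_res ** 2 * h ** 2)
    # Heat-loss function G(tD) = exp(tD)*erfc(sqrt(tD)) + 2*sqrt(tD/pi) - 1
    # (note: exp(tD) can overflow for very large tD; fine for modest times)
    G = math.exp(tD) * math.erfc(math.sqrt(tD)) + 2.0 * math.sqrt(tD / math.pi) - 1.0
    # Heated area [m^2]
    return q_inj * M_res * h / (4.0 * k_ob * M_ob * dT) * G
```

From the heated area one can derive oil displacement rate and thermal efficiency, which is what makes the model useful for quick screening of uniform-property floods.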


2020 ◽  
Author(s):  
Konrad Wojnar ◽  
Jon Sætrom ◽  
Tore Felix Munck ◽  
Martha Stunell ◽  
Stig Sviland-Østre ◽  
...  

Abstract The aim of this study was to create an ensemble of equiprobable models that could be used to improve the reservoir management of the Vilje field. Qualitative and quantitative workflows were developed to systematically and efficiently screen, analyze, and history match an ensemble of reservoir simulation models against production and 4D seismic data. The goal of developing these workflows is to increase the utilization of 4D seismic survey data for reservoir characterization. The qualitative and quantitative workflows are presented along with their benefits and challenges. The data conditioning produced a set of history-matched reservoir models that can be used in the field development decision-making process. The proposed workflows allowed identification of outlying prior and posterior models based on key features where the observed data were not covered by the synthetic 4D seismic realizations. As a result, suggestions were made for a more robust parameterization of the ensemble to improve data coverage. The existing history-matching workflow integrated efficiently with the quantitative 4D seismic history-matching workflow, allowing the reservoir models to be conditioned to both production and 4D data and thus improving their predictability. This paper proposes a systematic and efficient workflow using ensemble-based methods to simultaneously screen, analyze, and history match production and 4D seismic data. The proposed workflow improves the usability of 4D seismic data for reservoir characterization and, in turn, for reservoir management and decision making.


Reservoir modelling and production forecasting provide vital inputs to the efficient management of petroleum reservoirs. Since reservoirs are highly heterogeneous and nonlinear in nature, it is often difficult to obtain accurate estimates of the spatial distribution of reservoir properties and the corresponding production profiles. An accurate reservoir model, once built, enables efficient management of the reservoir. This paper describes the mathematical modelling of oil reservoirs along with various optimization techniques applicable to history matching and production forecasting. Gradient-based and non-gradient-based optimization techniques, viz. simulated annealing (SA), scatter search (SS), the neighborhood algorithm (NA), particle swarm optimization (PSO), ant colony optimization (ACO), ensemble Kalman filters (EnKF), and genetic algorithms (GA), and their application to reservoir production history matching and performance prediction are presented. Recent advancements and variants of these techniques are also presented.
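As an illustration of one of the listed techniques, a minimal particle swarm optimization (PSO) loop for minimizing a history-matching misfit might look like the following. This is a generic textbook PSO sketch, not tied to any specific reservoir code; the misfit callable stands in for a forward simulation plus data comparison.

```python
import numpy as np

def pso_history_match(misfit, bounds, n_particles=20, n_iters=50, seed=0):
    """Minimal particle-swarm optimizer for a history-matching misfit.

    misfit : callable mapping a parameter vector to a scalar error
    bounds : (n_params, 2) array of lower/upper parameter bounds
    """
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    x = rng.uniform(lo, hi, size=(n_particles, len(lo)))   # positions
    v = np.zeros_like(x)                                   # velocities
    pbest, pbest_f = x.copy(), np.array([misfit(p) for p in x])
    gbest = pbest[np.argmin(pbest_f)].copy()

    w, c1, c2 = 0.7, 1.5, 1.5      # inertia and acceleration weights
    for _ in range(n_iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        # Pull each particle toward its personal best and the global best
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([misfit(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest, pbest_f.min()
```

In practice each `misfit` evaluation is a full reservoir simulation, so the particle count and iteration budget dominate the cost.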


2005 ◽  
Vol 8 (05) ◽  
pp. 426-436 ◽  
Author(s):  
Hao Cheng ◽  
Arun Kharghoria ◽  
Zhong He ◽  
Akhil Datta-Gupta

Summary We propose a novel approach to history matching finite-difference models that combines the advantages of streamline models with the versatility of finite-difference simulation. Current streamline models are limited in their ability to incorporate complex physical processes and cross-streamline mechanisms in a computationally efficient manner. A unique feature of streamline models is their ability to analytically compute the sensitivity of the production data with respect to reservoir parameters using a single flow simulation. These sensitivities define the relationship between changes in production response because of small changes in reservoir parameters and, thus, form the basis for many history-matching algorithms. In our approach, we use the streamline-derived sensitivities to facilitate history matching during finite-difference simulation. First, the velocity field from the finite-difference model is used to compute streamline trajectories, time of flight, and parameter sensitivities. The sensitivities are then used in an inversion algorithm to update the reservoir model during finite-difference simulation. The use of a finite-difference model allows us to account for detailed process physics and compressibility effects. Although the streamline-derived sensitivities are only approximate, they do not seem to noticeably impact the quality of the match or the efficiency of the approach. For history matching, we use a generalized travel-time inversion (GTTI) that is shown to be robust because of its quasilinear properties and that converges in only a few iterations. The approach is very fast and avoids many of the subjective judgments and time-consuming trial-and-error steps associated with manual history matching. We demonstrate the power and utility of our approach with a synthetic example and two field examples. 
The first one is from a CO2 pilot area in the Goldsmith San Andres Unit (GSAU), a dolomite formation in West Texas with more than 20 years of waterflood production history. The second example is from a Middle Eastern reservoir and involves history matching a multimillion-cell geologic model with 16 injectors and 70 producers. The final model preserved all of the prior geologic constraints while matching 30 years of production history. Introduction Geological models derived from static data alone often fail to reproduce the field production history. Reconciling geologic models with the dynamic response of the reservoir is critical to building reliable reservoir models. Classical history-matching procedures, whereby reservoir parameters are adjusted manually by trial and error, can be tedious and often yield a reservoir description that is not realistic or consistent with the geologic interpretation. In recent years, several techniques have been developed for integrating production data into reservoir models. Integration of dynamic data typically requires a least-squares-based minimization to match the observed and calculated production responses. The approaches to such minimization can be classified broadly into three categories: gradient-based methods, sensitivity-based methods, and derivative-free methods. The derivative-free approaches, such as simulated annealing or genetic algorithms, require numerous flow simulations and can be computationally prohibitive for field-scale applications. Gradient-based methods have been used widely for automatic history matching, although their convergence rates are typically slower than those of sensitivity-based methods such as the Gauss-Newton or LSQR method. An integral part of the sensitivity-based methods is the computation of sensitivity coefficients.
These sensitivities are simply partial derivatives that define the change in production response due to small changes in reservoir parameters. The approaches to calculating sensitivity coefficients generally fall into one of three categories: the perturbation method, the direct method, and adjoint-state methods. Conceptually, the perturbation approach is the simplest and requires the fewest changes to an existing code. Sensitivities are estimated simply by perturbing the model parameters one at a time by a small amount and then computing the corresponding production response. This approach requires (N+1) forward simulations, where N is the number of parameters, and can therefore be computationally prohibitive for reservoir models with many parameters. In the direct or sensitivity-equation method, the flow and transport equations are differentiated to obtain expressions for the sensitivity coefficients. Because there is one equation for each parameter, this approach requires roughly the same amount of work as the perturbation method. A variation of this method, called the gradient-simulator method, uses the discretized version of the flow equations and takes advantage of the fact that the coefficient matrix remains unchanged for all the parameters and needs to be decomposed only once. Sensitivity computation for each parameter then requires only a matrix/vector multiplication, though this can still be computationally expensive for a large number of parameters. Finally, the adjoint-state method requires the derivation and solution of adjoint equations, which can be quite cumbersome for multiphase-flow applications. Furthermore, the number of adjoint solutions generally depends on the amount of production data and, thus, on the length of the production history.
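The perturbation method described above maps directly to code. A minimal sketch follows, with an abstract `simulate` callable standing in for the reservoir simulator; the one base run plus one perturbed run per parameter gives the (N+1) forward simulations noted in the text.

```python
import numpy as np

def perturbation_sensitivities(simulate, m, eps=1e-4):
    """Sensitivity coefficients by the perturbation (finite-difference)
    method: one base run plus one run per perturbed parameter, i.e.
    N + 1 forward simulations for N parameters.

    simulate : callable mapping a parameter vector to a response vector
    m        : (N,) reservoir-parameter vector
    Returns the (n_data, N) sensitivity matrix S_ij = d d_i / d m_j.
    """
    d0 = np.asarray(simulate(m))           # base-case response (run 1)
    S = np.empty((d0.size, m.size))
    for j in range(m.size):                # runs 2 .. N+1
        mp = m.copy()
        mp[j] += eps                       # perturb one parameter at a time
        S[:, j] = (np.asarray(simulate(mp)) - d0) / eps
    return S
```

For a model with thousands of gridblock parameters this loop is exactly what makes the method prohibitive, which is why streamline-derived or adjoint sensitivities are attractive.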


SPE Journal ◽  
2016 ◽  
Vol 21 (04) ◽  
pp. 1413-1424 ◽  
Author(s):  
Yuqing Chang ◽  
Andreas S. Stordal ◽  
Randi Valestrand

Summary Data assimilation with ensemble-based inversion methods has been successfully applied for parameter estimation in reservoir models. However, in certain complex reservoir models, it remains challenging to estimate the model parameters while simultaneously preserving geological realism. In particular, when handling special reservoir model parameters such as facies types describing fluvial channels, geological realism becomes one of the key concerns. The main objective of this work is to address these issues for a complex field with a newly extended version of a recently proposed facies-parameterization approach coupled with an ensemble-based data-assimilation method. The proposed workflow combines the new facies parameterization and the adaptive Gaussian mixture (AGM) filter into a data-assimilation framework for channelized reservoirs. To handle discrete facies parameters, we combine probability maps and truncated Gaussian fields to obtain a continuous parameterization of the facies fields. For the data assimilation, we use the AGM filter, an efficient history-matching approach that incorporates a resampling routine allowing facies fields to be regenerated with information from the updated probability maps. This workflow is evaluated, for the first time, on a complex field case, the Brugge field. This reservoir model consists of layers with complex channelized structures and layers characterized by reservoir properties generated with variograms. With limited prior knowledge of the facies model, the workflow is shown to preserve channel continuity while reducing the reservoir-model uncertainty with AGM. When applied to a complex reservoir, the proposed workflow provides a geologically consistent and realistic reservoir model that improves the prediction of subsurface flow behavior.
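The truncation idea described above (probability maps plus truncated Gaussian fields) can be illustrated with a minimal sketch. This is a simplified binary-facies version under my own assumptions, not the authors' exact parameterization: a continuous standard-Gaussian field is thresholded cell by cell at the Gaussian quantile of the local facies probability.

```python
import numpy as np
from statistics import NormalDist

def gaussian_to_facies(z, prob_map):
    """Map a continuous Gaussian field to a binary facies field using a
    locally varying truncation threshold derived from a probability map.

    z        : (ny, nx) standard-Gaussian random field (the continuous
               parameters an ensemble method can update)
    prob_map : (ny, nx) probability of channel facies at each cell
    Returns a (ny, nx) 0/1 facies field (1 = channel).
    """
    nd = NormalDist()
    # Cell-wise threshold Phi^{-1}(p), so that P(z < thr) = p at each cell
    thr = np.vectorize(nd.inv_cdf)(np.clip(prob_map, 1e-6, 1 - 1e-6))
    return (z < thr).astype(int)
```

Because the Gaussian field and the probability map are both continuous, they can be updated by an ensemble filter, while the discrete facies field is recovered by truncation afterwards.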


SPE Journal ◽  
2007 ◽  
Vol 12 (03) ◽  
pp. 382-391 ◽  
Author(s):  
Mohammad Zafari ◽  
Albert Coburn Reynolds

Summary Recently, the ensemble Kalman filter (EnKF) has gained popularity in atmospheric science for the assimilation of data and the assessment of uncertainty in forecasts for complex, large-scale problems. A handful of papers have discussed reservoir characterization applications of the EnKF, which can easily and quickly be coupled with any reservoir simulator. Neither adjoint code nor specific knowledge of simulator numerics is required to implement the EnKF. Moreover, data are assimilated (matched) as they become available; a suite of plausible reservoir models (the ensemble, or set of ensemble members or realizations) is continuously updated to honor new data without rematching data assimilated previously. Because of these features, the method is far more efficient for history matching dynamic data than automatic history matching based on optimization algorithms. Moreover, the set of realizations provides a way to evaluate the uncertainty in reservoir description and performance predictions. Here we establish a firm theoretical relation between randomized maximum likelihood and the ensemble Kalman filter. Although we have previously generated reservoir characterization examples where the method worked well, here we also provide examples where the EnKF does not provide a reliable characterization of uncertainty. Introduction Our main interest is in characterizing the uncertainty in reservoir description and reservoir performance predictions in order to optimize reservoir management. To do so, we wish to generate a suite of plausible reservoir models (realizations) that are consistent with all available information and data. If the set of models is obtained by correctly sampling the probability density function (pdf), then it characterizes the uncertainty in the reservoir model.
Thus, by predicting future reservoir performance with each of the realizations, and calculating statistics on the set of outcomes, one can evaluate the uncertainty in reservoir performance predictions.
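The EnKF analysis step discussed in this summary has a standard ensemble form. Below is a minimal sketch of one update with perturbed observations, a common textbook variant; the names and the scalar measurement-error model are my own simplifications, not this paper's exact implementation.

```python
import numpy as np

def enkf_update(M, D, d_obs, R_std, seed=0):
    """One EnKF analysis step on an ensemble of reservoir models.

    M     : (n_param, n_ens) ensemble of model parameters (e.g. log-perm)
    D     : (n_data, n_ens) predicted data for each ensemble member
    d_obs : (n_data,) observed production data
    R_std : measurement-error standard deviation (scalar, for simplicity)
    """
    rng = np.random.default_rng(seed)
    n_ens = M.shape[1]
    # Mean-centred anomalies of parameters and predicted data
    A = M - M.mean(axis=1, keepdims=True)
    Y = D - D.mean(axis=1, keepdims=True)
    # Kalman gain K = C_md (C_dd + R)^-1 from ensemble covariances
    C_dd = Y @ Y.T / (n_ens - 1) + R_std ** 2 * np.eye(len(d_obs))
    K = (A @ Y.T / (n_ens - 1)) @ np.linalg.inv(C_dd)
    # Each member assimilates a perturbed copy of the observations
    d_pert = d_obs[:, None] + R_std * rng.standard_normal(D.shape)
    return M + K @ (d_pert - D)
```

Repeating this update as new data arrive gives the sequential assimilation the summary describes; the paper's point is that the resulting ensemble spread is not always a reliable measure of uncertainty for nonlinear problems.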

