Reservoir History Matching by Bayesian Estimation

1976 ◽  
Vol 16 (06) ◽  
pp. 337-350 ◽  
Author(s):  
G.R. Gavalas ◽  
P.C. Shah ◽  
J.H. Seinfeld

Abstract The estimation of reservoir properties is inherently an underdetermined problem (one having a nonunique solution) because of the large number of unknown parameters relative to the available data. The common zonation approach to reducing the number of parameters introduces considerable modeling error by insisting that reservoir properties are uniform within each zone and by assigning the boundaries of these zones more or less arbitrarily. In this paper, Bayesian estimation theory is applied to history matching as an alternative to zonation. By using a priori statistical information on the unknown parameters, the problem becomes statistically better determined. Bayesian estimation and zonation are applied to the problem of porosity and permeability estimation in a one-dimensional, one-phase reservoir. Introduction The estimation of parameters such as porosity and permeability in a reservoir model using well production and pressure data is commonly referred to as history matching. Although an inhomogeneous reservoir is in principle specified by an infinite number of parameters, a computational reservoir model can contain only a finite number. The most detailed description is obtained by allowing porosity and permeability to vary independently at each block of the spatial grid used in the finite-difference solution. While minimizing the modeling error, this approach entails a great deal of uncertainty because of the large number of unknowns compared with the limited data available. Thus, in a given problem, many different sets of property estimates may provide satisfactory and essentially indistinguishable data fits. 
Some of these parameter estimates can be grossly in error with respect to the actual properties and, as a result, can lead to erroneous prediction of future reservoir behavior. To reduce the statistical uncertainty one must either decrease the number of unknowns or utilize additional information. A commonly used procedure for reducing the number of unknown parameters is zonation: the reservoir is divided into a small number of zones, in each of which the properties are treated as uniform. A modeling error is thus introduced through the assumption of uniform properties within each zone and through the more or less arbitrary assignment of the zone boundaries. As the number of zones is decreased, the error due to statistical uncertainty decreases while the modeling error increases. The total error passes through a minimum at some intermediate number of zones. The specification of this optimum level of description, which has been briefly considered in past work, will be treated in detail in a future report. An alternative to decreasing the statistical uncertainty by reducing the number of unknown parameters is the utilization of additional information. This information need not be limited to measurements on the reservoir under study, but can be based on prior geological information about property variability in reservoirs of the same type. This paper examines this alternative method of reducing statistical uncertainty. The prior geological information is utilized by a formulation akin to classical Bayesian estimation. The Bayesian estimation is illustrated and compared with the zonation approach for the case of a hypothetical, one-dimensional reservoir with variable porosity and permeability. 
The numerical simulations are used to investigate questions such as the optimum number of parameters in zonation and the effect of erroneous prior statistics in Bayesian estimation, and to compare the two methods. Considerable attention is also given to computational aspects such as convergence rate and computer time required by two of the most commonly used minimization algorithms, Marquardt's and the conjugate gradient. NATURE OF PRIOR GEOLOGICAL INFORMATION The application of probabilistic models in geology is the subject of a recent review. SPEJ P. 337
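The Bayesian alternative the abstract describes can be sketched, for a linearized Gaussian case, as a standard maximum a posteriori (MAP) update; everything below (matrices, dimensions, noise levels) is illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

n_param, n_data = 20, 5                   # many unknowns, few measurements
G = rng.normal(size=(n_data, n_param))    # linearized sensitivity of data to properties
m_true = rng.normal(loc=0.2, scale=0.05, size=n_param)  # e.g. a porosity field
d = G @ m_true + rng.normal(scale=0.01, size=n_data)    # noisy pressure-like data

m_prior = np.full(n_param, 0.2)           # prior mean from geological analogs
C_m = 0.05**2 * np.eye(n_param)           # prior covariance of the properties
C_d = 0.01**2 * np.eye(n_data)            # measurement-error covariance

# MAP estimate for the linear-Gaussian case:
# m_post = m_prior + C_m G^T (G C_m G^T + C_d)^{-1} (d - G m_prior)
K = C_m @ G.T @ np.linalg.inv(G @ C_m @ G.T + C_d)
m_post = m_prior + K @ (d - G @ m_prior)
C_post = C_m - K @ G @ C_m                # posterior covariance

# The prior makes the underdetermined problem statistically better
# determined: posterior uncertainty never exceeds prior uncertainty.
assert np.trace(C_post) <= np.trace(C_m)
```

The same update with `C_m` replaced by a hard zonation constraint is what the paper compares it against.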

1984 ◽  
Vol 24 (06) ◽  
pp. 697-706 ◽  
Author(s):  
A.T. Watson ◽  
G.R. Gavalas ◽  
J.H. Seinfeld

Abstract Since the number of parameters to be estimated in a reservoir history match is potentially quite large, it is important to determine which parameters can be estimated with reasonable accuracy from the available data. This aspect can be called determining the identifiability of the parameters. The identifiability of porosity and absolute and relative permeabilities on the basis of flow and pressure data in a two-phase (oil/water) reservoir is considered. The question posed is: How accurately can one expect to estimate spatially variable porosity and absolute permeability and relative permeabilities given typical production and pressure data? To gain insight into this question, analytical solutions for pressure and saturation in a one-dimensional (1D) waterflood are used. The following conclusions are obtained. Only the average value of the porosity can be determined on the basis of water/oil flow measurements. The permeability distribution can be determined from pressure drop data with an accuracy depending on the mobility ratio. Exponents in a power function representation of the relative permeabilities can be determined from WOR data alone, but not nearly so accurately as when pressure drop and flow data are used simultaneously. Introduction The utility of reservoir simulation in predicting reservoir behavior is limited by the accuracy with which reservoir properties can be estimated. Because of the high costs associated with coring analysis, reservoir engineers must rely on history matching as a means of estimating reservoir properties. 
In this process a history match is carried out by choosing the reservoir properties as those that result in simulated well pressure and flow data that match as closely as possible those measured during production. In general, reservoir properties at each gridblock in the simulator represent the unknown values to be determined. Although there are efficient methods for estimating such a large number of unknowns, it has long been recognized from the results of single-phase history matching exercises that many different sets of parameter values may yield a nearly identical match of observed and predicted pressures. The conventional single-phase history matching problem is in fact a mathematically ill-posed problem, which explains its nonunique behavior. Such a situation is, in short, the result of the large number of unknowns to be estimated on the basis of the available data and the lack of sensitivity of the simulator solutions to the parameters. Because of this lack of sensitivity, the need to reduce the number of unknown parameters or to introduce some additional constraints, such as "smoothness" of the estimated parameters, has been recognized. A problem as important as that of choosing which minimization method to employ in history matching is that of choosing, on the basis of the available well data, which properties actually should be estimated. This selection depends on the relationship of the unknown parameters to the simulated well data. Ideally one would want to know (1) which parameters can be determined uniquely if the measurements were exact, and (2) given the expected level of error in the measurements, how accurately we can expect to be able to estimate the parameters. 
The first question, that of establishing uniqueness of the estimated parameters, is notoriously difficult to answer, and for a problem as complicated as reservoir history matching, there are virtually no general results available that allow one to establish uniqueness for permeability or porosity. Thus, it is not possible in general to base our choice of which parameters to estimate on rigorous mathematical uniqueness results. In lieu of an answer to Question 1, the selection of parameters to be estimated can be based on Question 2, which is amenable to theoretical analysis. If the expected errors in estimation of any of the parameters, or any linear combination of the parameters, are extremely large, then that parameter or set of parameters can be judged as not identifiable. In such a case, steps may be taken to reduce the number of unknown parameters. In summary, the reservoir history matching problem is a difficult parameter estimation problem, and understanding the relationship between the unknown parameters and the measured data is essential to obtaining meaningful estimates of the reservoir properties. Quantitative studies regarding the accuracy of estimates for single-phase history matching problems have been reported by Shah et al. and Dogru et al. Shah et al. investigated the optimal level of zonation for use with 1D single-phase (oil) situations. SPEJ P. 697
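Question 2 — judging identifiability from expected estimation errors — can be sketched with a singular-value decomposition of a hypothetical sensitivity matrix. The matrix and thresholds below are illustrative assumptions, not the paper's analysis:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sensitivity matrix: rows = measurements, cols = parameters.
# Column 2 is (nearly) a copy of column 0, so that combination of
# parameters barely affects the data.
G = rng.normal(size=(30, 4))
G[:, 2] = G[:, 0] + 1e-6 * rng.normal(size=30)

sigma = 0.01                      # expected measurement-error level

# Tiny singular values of G correspond to parameter combinations the
# data are insensitive to.
U, s, Vt = np.linalg.svd(G, full_matrices=False)

# Expected estimation error along each right-singular direction: sigma / s_i
err = sigma / s

# Directions whose expected error is enormous are "not identifiable";
# reducing the parameter set (or regularizing) is then warranted.
unidentifiable = [Vt[i] for i in range(len(s)) if err[i] > 1.0]
assert len(unidentifiable) >= 1
```

Here the near-duplicate column produces exactly one poorly constrained direction, which is the kind of diagnosis the theoretical analysis formalizes.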


SPE Journal ◽  
2019 ◽  
Vol 25 (01) ◽  
pp. 139-161 ◽  
Author(s):  
Jincong He ◽  
Albert C. Reynolds ◽  
Shusei Tanaka ◽  
Xian-Huan Wen ◽  
Jairam Kamath

Summary A common pitfall in probabilistic history matching is omitting the local variation of spatial uncertainties and falsely generalizing the learning from local data to the entire field. This can lead to radical overestimation of uncertainty reduction and bad reservoir-management decisions. In this paper, we propose a methodology to quantify and correct for the error that arises from the omission of local variation in probabilistic history matching. Most performance metrics in an oil field, such as the original oil in place (OOIP) and the estimated ultimate recovery (EUR), are field-scale objective functions that depend on properties (e.g., porosity) over the entire field. On the other hand, many measurement data from wells [e.g., bottomhole pressure (BHP)] are mainly sensitive to the reservoir properties near the locations where they are measured, and thus they are susceptible to local variations of reservoir properties. Calibrating field-scale objective functions to local well data without properly characterizing the local variation can overestimate the uncertainty reduction of field-scale objective functions. In this paper, we derived formulas to quantify errors in the posterior cumulative distribution functions (CDFs) of the objective functions resulting from the omission of local variation. We also provide a way to correct for the error and to recover the true posterior CDFs. Through theoretical derivation, we show that the modeling error that arises from the omission of local variation is dependent on the magnitude of the global and local variations of the uncertain properties (e.g., porosity). The larger the local variation relative to the global variation, the larger the error in the estimated posterior distributions. The error also depends on the variogram of the local variation and the detection range of the data. The error is larger for cases with a long variogram for the local variation and a short data-detection range. 
In addition, the modeling errors for different measurement data points can be highly correlated even when the measurement errors for these data are independent. To correct for this modeling error, analytical and empirical formulas are proposed that have been shown to greatly improve the accuracy of the posterior distributions in a number of cases. To the best of our knowledge, this is the first time that the modeling error from the omission of local variation in the probabilistic history-matching process has been quantified and corrected. The methodology proposed could help improve the reliability of the result from probabilistic history matching.
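The core effect — omitting local variation overstates the uncertainty reduction from well data — can be shown with a minimal scalar Gaussian example. The variances are chosen for illustration and are not taken from the paper:

```python
import numpy as np

# Scalar sketch: a field-scale objective depends on a global property g;
# the well datum d = g + l + e also sees a local deviation l and noise e.
var_g, var_l, var_e = 1.0, 0.5, 0.1   # illustrative variances

# Correct Bayesian update treats (l + e) as the effective data error:
var_post_correct = 1.0 / (1.0 / var_g + 1.0 / (var_l + var_e))

# Omitting local variation attributes only e to the data error:
var_post_naive = 1.0 / (1.0 / var_g + 1.0 / var_e)

# The naive posterior is too narrow, so the uncertainty reduction is
# overstated; the larger var_l is relative to var_g, the worse the error.
assert var_post_naive < var_post_correct < var_g
```

Extending this to correlated local deviations at many wells gives the correlated modeling errors the summary describes.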


2019 ◽  
Vol 7 (4) ◽  
pp. SL19-SL36
Author(s):  
Gabriel L. Machado ◽  
Garrett J. Hickman ◽  
Maulin P. Gogri ◽  
Kurt J. Marfurt ◽  
Matthew J. Pranter ◽  
...  

Over the past eight years, north-central Oklahoma has experienced a significant increase in seismicity. Although the disposal of large volumes of wastewater into the Arbuckle Group basement system has been statistically correlated to this increased seismicity, our understanding of the actual mechanisms involved is somewhat superficial. To address this shortcoming, we initiated an integrated study to characterize and model the Arbuckle-basement system to increase our understanding of the subsurface dynamics during the wastewater-disposal process. We constructed a 3D geologic model that integrates 3D seismic data, well logs, core measurements, and injection data. Poststack-data conditioning and seismic attributes provided images of faults and the rugose top of the basement, whereas a modified-Hall analysis provided insights into the injection behavior of the wells. Using a Pareto-based history-matching technique, we calibrated the 3D models using the injection rate and pressure data. The history-matching process showed the dominant parameters to be formation-water properties, permeability, porosity, and horizontal anisotropy of the Arbuckle Group. Based on the pressure buildup responses from the calibrated models, we identified sealing and conductive characteristics of the key faults. Our analysis indicates the average porosity and permeability of the Arbuckle Group to be approximately 7% and 10 mD, respectively. The simulation models also showed pockets of nonuniform and large pressure buildups in these formations, indicating that faults play an important role in fluid movement within the Arbuckle Group basement system. As one of the first integrated investigations conducted to understand the potential hydraulic coupling between the Arbuckle Group and the underlying basement, we evaluate the need for improved data recording and additional data collection. 
In particular, we recommend that operators wishing to pursue this type of analysis record their injection data on a daily rather than on an averaged basis. A more quantitative estimation of reservoir properties requires the acquisition of P-wave and dipole sonic logs in addition to the commonly acquired triple-combo logs. Finally, to better quantify flow units within the disposal reservoir, we recommend that operators acquire sufficient core to characterize the reservoir heterogeneity.


1978 ◽  
Vol 18 (03) ◽  
pp. 219-228 ◽  
Author(s):  
P.C. Shah ◽  
G.R. Gavalas ◽  
J.H. Seinfeld

SHAH, P.C.* CALIFORNIA INSTITUTE OF TECHNOLOGY, PASADENA, CALIF. GAVALAS, G.R., CALIFORNIA INSTITUTE OF TECHNOLOGY, PASADENA, CALIF. MEMBER SPE-AIME SEINFELD, J.H., CALIFORNIA INSTITUTE OF TECHNOLOGY, PASADENA, CALIF. MEMBER SPE-AIME Abstract The accuracy of the porosity and permeability estimates obtained in reservoir history matching is investigated using covariance analysis. The estimate covariance matrix is obtained for the following cases: estimation of all individual grid properties, parametrization using sensitivity vectors, parametrization by zonation, and Bayesian estimation. The trace of the covariance matrix, used as a measure of the overall accuracy, is studied as a function of the number of unknown parameters, and a procedure for selecting the optimum parametrization is developed. Numerical calculations with a one-dimensional reservoir are used to illustrate the theory. Introduction The problem of estimating parameters in mathematical models of petroleum reservoirs (called history matching) is notoriously difficult. Whereas the estimation problem can be posed straightforwardly, often it is impossible to obtain meaningful parameter estimates. The principal difficulty is that there are usually more unknown parameters than data points and that the data are not sensitive enough to changes in the parameters. Several early and important studies developed the methodology required to treat history matching as a nonlinear regression problem and investigated associated computational problems. More recently, the introduction of methods of optimal control resulted in improved algorithms for automatic history matching. The actual reliability of the property estimates was approached initially in a qualitative fashion and more recently was studied quantitatively. 
The last two studies treat a small number of unknown parameters from single-well data. In principle, the case of a more detailed reservoir model with data from several wells can be analyzed in the same way, but requires more tedious computations. A straightforward approach to history matching is to assume that rock properties in each grid block used in the numerical solution of the reservoir simulation equations are unknown parameters. In most practical situations, this approach leads to a large number of unknowns. Evidently, the reservoir model must include only a modest number of unknown parameters for meaningful history matching. Replacing a reservoir model with arbitrarily varying properties by a model in which rock properties are determined by a limited number of parameters henceforth will be called parametrization. The traditional approach to reducing the number of unknown parameters is zonation of the reservoir, in which properties are assumed the same over regions encompassing several grid blocks. In zonation and other types of parametrization that might be used to reduce the number of unknowns, the error in property estimates has two components. One error is due to the parametrization itself and generally decreases as the number of unknowns increases. The other error is proportional to the measurement error and increases with an increasing number of unknowns. The total error reaches a minimum at some intermediate level of parametrization, which can be regarded as the optimum level. The problem of selecting an optimum number of parameters has been approached only qualitatively before. 
One of the main objectives of this study is to develop a practical and quantitative procedure for determining the optimum level of parametrization. An alternative to parametrization is to use prior geological information in the form of a prior probability density of the reservoir properties considered as random variables. A form of Bayesian estimation then can be employed to determine the unknown properties. One question that arises when applying this procedure is the effect of error in the prior statistics employed. In a previous study, we examined various practical aspects of Bayesian estimation concerning the reservoir problem and compared the Bayesian method with the zonation method. SPEJ
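The trade-off this abstract describes — modeling error falling and statistical error rising with the number of zones — can be sketched numerically. The property profile, noise level, and simple error model below are illustrative assumptions, not the paper's formulation:

```python
import numpy as np

n_blocks, n_data = 64, 32
sigma_meas = 0.05

# Smoothly varying "true" permeability-like profile over the grid blocks.
x = np.linspace(0.0, 1.0, n_blocks)
k_true = 1.0 + 0.5 * np.sin(2 * np.pi * x)

def total_error(n_zones):
    """Modeling error from forcing each zone uniform, plus a statistical
    error term that grows with the number of unknowns (a stand-in for the
    trace of the estimate covariance matrix)."""
    zones = np.array_split(np.arange(n_blocks), n_zones)
    # Modeling error: variance lost by replacing each zone with its mean.
    model_err = sum(np.sum((k_true[idx] - k_true[idx].mean())**2) for idx in zones)
    # Statistical error: grows roughly linearly with unknowns per datum.
    stat_err = n_zones * sigma_meas**2 * n_blocks / n_data
    return model_err + stat_err

errors = {n: total_error(n) for n in range(1, n_blocks + 1)}
n_opt = min(errors, key=errors.get)
# The total error is minimized at an intermediate number of zones.
assert 1 < n_opt < n_blocks
```

The minimizing `n_opt` is the "optimum level of parametrization" in this toy setting; the paper develops the quantitative criterion properly via the covariance trace.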


2020 ◽  
Vol 12 (1) ◽  
pp. 1533-1540
Author(s):  
Si Yuanlei ◽  
Li Maofei ◽  
Liu Yaoning ◽  
Guo Weihong

Abstract The transient electromagnetic method (TEM) is often used in urban underground space exploration and field geological resource detection. Inversion is the most important step in data interpretation. Because of the volume effect of the TEM, the inversion results are usually nonunique. To reduce this nonuniqueness, the constrained inversion of TEM data has been studied using the least squares method. The inversion trials were performed using two three-layer theoretical geological models and one four-layer theoretical geological model. The results show that one-dimensional least squares constrained inversion is faster and more effective than unconstrained inversion. The induced electromotive force attenuation curves of the inversion models indicate that the same attenuation curve may arise from different geological conditions. Therefore, constrained inversion using known geological information can more accurately reflect the underground geology.
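A hedged sketch of bounded least squares inversion, with a toy two-parameter exponential-decay model standing in for a real TEM forward model (none of this is the authors' code; the bounds play the role of the known geological constraints):

```python
import numpy as np
from scipy.optimize import least_squares

# Toy stand-in for a TEM forward model: the induced-EMF decay is modeled
# as a sum of exponentials whose rates play the role of layer parameters.
t = np.linspace(0.1, 3.0, 40)

def forward(p):
    a1, a2 = p
    return np.exp(-a1 * t) + 0.5 * np.exp(-a2 * t)

p_true = np.array([1.0, 4.0])
rng = np.random.default_rng(3)
data = forward(p_true) + rng.normal(scale=1e-3, size=t.size)

def residual(p):
    return forward(p) - data

# Unconstrained inversion from a poor start can wander among equivalent
# fits; bounding the parameters with prior geological knowledge narrows
# the search to the physically plausible region.
fit = least_squares(residual, x0=[0.5, 2.0], bounds=([0.5, 2.0], [2.0, 8.0]))
assert np.allclose(fit.x, p_true, atol=0.1)
```

The nonuniqueness noted in the abstract shows up here as nearly identical decay curves for different parameter pairs; the bounds are what break the tie.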


2020 ◽  
Vol 25 (2) ◽  
pp. 29
Author(s):  
Desmond Adair ◽  
Aigul Nagimova ◽  
Martin Jaeger

The vibration characteristics of a nonuniform, flexible and free-flying slender rocket experiencing constant thrust are investigated. The rocket is idealized as a classic nonuniform beam with a constant one-dimensional follower force and with free-free boundary conditions. The equations of motion are derived by applying the extended Hamilton’s principle for non-conservative systems. Natural frequencies and associated mode shapes of the rocket are determined using the relatively efficient and accurate Adomian modified decomposition method (AMDM), with the solutions obtained by solving a set of algebraic equations with only three unknown parameters. The method can easily be extended to obtain approximate solutions to vibration problems for any type of nonuniform beam.
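As a classical reference point (not from the paper): a uniform free-free Euler-Bernoulli beam with no follower force has the frequency equation cos(βL)·cosh(βL) = 1, and any nonuniform-beam method such as AMDM should recover its roots in the uniform limit. Solving it numerically:

```python
import numpy as np
from scipy.optimize import brentq

# Frequency equation for a uniform free-free Euler-Bernoulli beam; each
# root bL gives a natural frequency w_n = (bL)^2 * sqrt(EI / (rho*A*L^4)).
f = lambda x: np.cos(x) * np.cosh(x) - 1.0

# First three nonzero roots, bracketed near their known locations.
roots = [brentq(f, lo, hi) for lo, hi in [(4.0, 5.0), (7.0, 8.5), (10.5, 11.5)]]

# Textbook values: approximately 4.7300, 7.8532, 10.9956.
assert np.allclose(roots, [4.7300, 7.8532, 10.9956], atol=1e-3)
```

The follower force and nonuniform mass/stiffness distributions shift these roots, which is exactly what the AMDM solution quantifies.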


2021 ◽  
Author(s):  
Yifei Xu ◽  
Priyesh Srivastava ◽  
Xiao Ma ◽  
Karan Kaul ◽  
Hao Huang

Abstract In this paper, we introduce an efficient method to generate reservoir simulation grids and modify the fault juxtaposition on the generated grids. Both processes are based on a mapping method that displaces vertices of a grid to desired locations without changing the grid topology. In the gridding process, a grid that can capture stratigraphic complexity is first generated in an unfaulted space. The vertices of the grid are then displaced back to the original faulted space to become a reservoir simulation grid. The resulting reversely mapped grid has a mapping structure that allows fast and easy fault juxtaposition modification. This feature avoids the process of updating the structural framework and regenerating the reservoir properties, which may be time-consuming. To facilitate juxtaposition updates within an assisted history matching workflow, several parameterized fault-throw adjustment methods are introduced. Grid examples are given for reservoirs with Y-faults, overturned beds, and complex channel-lobe systems.
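The key property of the mapping approach — vertices move, connectivity does not — can be sketched minimally. All names and numbers below are illustrative, not the authors' implementation:

```python
import numpy as np

# A tiny 2D "grid": vertex coordinates plus fixed cell connectivity.
vertices = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
cells = [(0, 1, 3, 2)]                  # indices into the vertex array

# A hypothetical parameterized fault-throw adjustment: displace only the
# vertices on one side of the fault by a throw vector.
throw = np.array([0.0, 0.3])
on_hanging_wall = vertices[:, 0] > 0.5  # vertices to displace

mapped = vertices.copy()
mapped[on_hanging_wall] += throw        # new juxtaposition

# Topology is untouched: same cells, same vertex count, so properties
# attached to cells or vertices never need regenerating.
assert cells == [(0, 1, 3, 2)] and mapped.shape == vertices.shape
```

Because only coordinates change, sweeping the `throw` parameter inside an assisted-history-matching loop is cheap, which is the point of the mapping structure.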


2021 ◽  
Author(s):  
Elizabeth Ruiz ◽  
Brandon Thibodeaux ◽  
Christopher Dorion ◽  
Herman Mukisa ◽  
Majid Faskhoodi ◽  
...  

Abstract Optimized geomodeling and history matching of production data are presented by utilizing an integrated rock and fluid workflow. Facies identification is performed by use of image logs and other geological information. In addition, image logs are used to help define structural geodynamic processes that occurred in the reservoir. Methods of reservoir fluid geodynamics are used to assess the extent of fluid compositional equilibrium, especially of the asphaltenes, and thereby the extent of connectivity in these facies. Geochemical determinations are shown to be consistent with measurements of compositional thermodynamic equilibrium. The ability to develop the geo-scenario of the reservoir, the coherent evolution of rock and contained fluids in the reservoir over geologic time, improves the robustness of the geomodel. In particular, the sequence of oil charge, compositional equilibrium, fault block throw, and primary biogenic gas charge is established in this middle Pliocene reservoir with implications for production, field extension, and local basin exploration. History matching of production data proves the accuracy of the geomodel; nevertheless, refinements to the geomodel and improved history matching were obtained by expanded deterministic property estimation from wireline log and other data. The early connection of fluid data, both thermodynamic and geochemical, with relevant facies and their property determination enables a more facile method to incorporate these data into the geomodel. Logging data from future wells in the field can be imported into the geomodel, allowing deterministic optimization of this model long after production has commenced. While each reservoir is unique with its own idiosyncrasies, the workflow presented here is generally applicable to all reservoirs and always improves reservoir understanding.


Author(s):  
Abd El-Maseh, M. P

<p>In this paper, Bayesian estimation of the unknown parameters of the bivariate generalized exponential (BVGE) distribution under bivariate type-I censored samples with constant-stress accelerated life testing (CSALT) is discussed. The scale parameter of the lifetime distribution at constant stress levels is assumed to be an inverse power law function of the stress level. The parameters are estimated by a Bayesian approach using the Markov chain Monte Carlo (MCMC) method based on Gibbs sampling. Numerical studies are then introduced to illustrate the approach using samples generated from the BVGE distribution.</p>
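An illustrative sketch of the MCMC mechanics, deliberately much simpler than the paper's setting: a univariate generalized exponential (GE) distribution with known scale, sampled with random-walk Metropolis rather than the paper's bivariate, censored, Gibbs-sampling formulation:

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulate GE(alpha, 1) data via the inverse CDF, F(x) = (1 - e^{-x})^alpha.
alpha_true = 2.0
n = 200
u = rng.uniform(size=n)
x = -np.log(1.0 - u ** (1.0 / alpha_true))

def log_post(a):
    """Log-posterior for the shape parameter with a nearly flat prior."""
    if a <= 0:
        return -np.inf
    loglik = n * np.log(a) - np.sum(x) + (a - 1.0) * np.sum(np.log1p(-np.exp(-x)))
    return loglik - 0.001 * a

# Random-walk Metropolis: propose, accept with the usual log-ratio test.
samples, a = [], 1.0
for _ in range(5000):
    prop = a + 0.2 * rng.normal()
    if np.log(rng.uniform()) < log_post(prop) - log_post(a):
        a = prop
    samples.append(a)

post_mean = np.mean(samples[1000:])   # discard burn-in
assert abs(post_mean - alpha_true) < 0.5
```

The paper's Gibbs sampler follows the same accept/update logic but cycles through full conditional distributions of all BVGE and CSALT parameters.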

