Proxy-Based Work Flow for a Priori Evaluation of Data-Acquisition Programs

SPE Journal ◽  
2016 ◽  
Vol 21 (04) ◽  
pp. 1400-1412 ◽  
Author(s):  
Jincong He ◽  
Jiang Xie ◽  
Pallav Sarma ◽  
Xian-Huan Wen ◽  
Wen H. Chen ◽  
...  

Summary Data-acquisition programs, such as surveillance and pilot, play an important role in reservoir management, and are crucial for minimizing subsurface risks and improving decision quality. Optimal design of the data-acquisition plan requires predicting the performance (e.g., in terms of the expected amount of uncertainty reduction in an objective function) of a given design before it is implemented. Because the data from the acquisition program are uncertain at the time of the analysis, multiple history-matching runs are required for different plausible realizations of the observed data to evaluate the expected effectiveness of the program in reducing uncertainty. As such, the computational cost may be prohibitive because the number of reservoir simulations needed for the multiple history-matching runs would be substantial. This paper proposes a framework on the basis of proxies and rejection sampling (filtering) to perform the multiple history-matching runs with a manageable number of reservoir simulations. The work flow proposed does not depend on the linear Gaussian assumption that is a common, yet questionable, assumption in existing methods. The work flow also enables both qualitative and quantitative analysis of a surveillance plan. Qualitatively, heavy-hitter alignment analysis for the objective function and the observed data provides actionable measures for screening different surveillance designs. Quantitatively, the evaluation of expected uncertainty reduction from different surveillance plans allows for optimal design and selection of surveillance plans.
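The rejection-sampling (filtering) idea behind this work flow can be illustrated with a toy Monte Carlo sketch. Everything here is hypothetical (a scalar parameter, a linear forward model, arbitrary noise levels), standing in for the paper's reservoir simulations: a prior ensemble of simulated data/objective pairs is filtered against many plausible observed-data realizations, and the resulting posterior variances are averaged to estimate the expected uncertainty reduction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy ensemble: each prior realization yields a simulated
# observation d and an objective J (e.g., cumulative oil production).
theta = rng.normal(size=5000)            # uncertain parameter
d = theta + 0.3 * rng.normal(size=5000)  # simulated survey data (noisy)
J = 2.0 * theta                          # objective function

prior_var = J.var()

# Treat a subset of ensemble members as plausible "observed" outcomes of
# the survey, and filter the prior by rejection: for each outcome, keep
# the members whose simulated data fall within a tolerance of it.
tol = 0.2
post_vars = []
for d_obs in d[:200]:
    accepted = J[np.abs(d - d_obs) < tol]
    if accepted.size > 1:
        post_vars.append(accepted.var())

# Average the posterior variances over the plausible data realizations.
expected_reduction = 1.0 - np.mean(post_vars) / prior_var
print(f"expected fractional variance reduction: {expected_reduction:.2f}")
```

Note that no linear Gaussian assumption enters the filtering step itself; the cost driver is the ensemble size needed for enough accepted members per data realization, which is what the paper's proxies keep manageable.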

SPE Journal ◽  
2018 ◽  
Vol 23 (02) ◽  
pp. 428-448 ◽  
Author(s):  
Jincong He ◽  
Pallav Sarma ◽  
Eric Bhark ◽  
Shusei Tanaka ◽  
Bailian Chen ◽  
...  

Summary Data-acquisition programs, such as surveillance and pilots, play an important role in minimizing subsurface risks and improving decision quality for reservoir management. For design optimization and investment justification of these programs, it is crucial to be able to quantify the expected uncertainty reduction and the value of information (VOI) attainable from a given design. This problem is challenging because the data from the acquisition program are uncertain at the time of the analysis. In this paper, a method called ensemble-variance analysis (EVA) is proposed. Derived from a multivariate Gaussian assumption between the observation data and the objective function, the EVA method quantifies the expected uncertainty reduction from covariance information that is estimated from an ensemble of simulations. The result of EVA can then be used with a decision tree to quantify the VOI of a given data-acquisition program. The proposed method has several novel features compared with existing methods. First, the EVA method directly considers the data/objective-function relationship. Therefore, it can handle nonlinear forward models and an arbitrary number of parameters. Second, for cases when the multivariate Gaussian assumption between the data and objective function does not hold, the EVA method still provides a lower bound on expected uncertainty reduction, which can be useful in providing a conservative estimate of the surveillance/pilot performance. Finally, EVA also provides an estimate of the shift in the mean of the objective-function distribution, which is crucial for VOI calculation. In this paper, the EVA work flow for expected-uncertainty-reduction quantification is described. The result from EVA is benchmarked with recently proposed rigorous sampling methods, and the capacity of the method for VOI quantification is demonstrated for a pilot-analysis problem using a field-scale reservoir model.
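Under the multivariate Gaussian assumption between data and objective, the expected posterior variance of the objective J given data d is Var(J) - C_Jd C_dd^-1 C_dJ, and all the covariances can be estimated from a single ensemble of simulations. A minimal numpy sketch on a hypothetical toy ensemble (not the paper's field model):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy ensemble (names illustrative): N simulations, each giving a
# data vector d (2 measurements) and a scalar objective J.
N = 2000
x = rng.normal(size=(N, 2))                    # latent reservoir parameters
d = x + 0.5 * rng.normal(size=(N, 2))          # simulated observations
J = x @ np.array([1.0, -2.0])                  # objective values

# Ensemble estimates of the covariances used by the EVA formula.
C_dd = np.cov(d, rowvar=False)                 # data covariance (2x2)
C_Jd = np.array([np.cov(J, d[:, k])[0, 1] for k in range(2)])

var_prior = J.var(ddof=1)
# Expected posterior variance under the multivariate Gaussian assumption:
# Var(J | d) = Var(J) - C_Jd C_dd^{-1} C_dJ  (independent of the realized d)
var_post = var_prior - C_Jd @ np.linalg.solve(C_dd, C_Jd)

reduction = 1.0 - var_post / var_prior
print(f"expected uncertainty reduction: {reduction:.2f}")
```

When the Gaussian assumption fails, this quantity is the lower bound on expected uncertainty reduction that the paper describes.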


SPE Journal ◽  
2013 ◽  
Vol 19 (04) ◽  
pp. 621-635 ◽  
Author(s):  
Cheng Dai ◽  
Heng Li ◽  
Dongxiao Zhang

Summary Reservoir simulations involve a large number of formation and fluid parameters, many of which are subject to uncertainties owing to the combination of spatial heterogeneity and insufficient measurements. Accurately quantifying the impact of varying parameters on simulation models can reveal the importance of the parameters, which helps in designing field-characterization strategies and determining parameterization for history matching. Compared with the commonly used local sensitivity analysis (SA), global SA considers the whole variation range of the parameters and can thus provide more-complete information. However, the traditional global sensitivity analysis that is derived from Monte Carlo simulation (MCS) is computationally too demanding for reservoir simulations. In this study, we propose an alternative approach that is both accurate and efficient. In the proposed approach, the model outputs such as pressure and reservoir production quantities are expressed by polynomial chaos expansions (PCEs). The probabilistic collocation method is used to determine the coefficients of the polynomial expansions by solving outputs at different sets of collocation points by means of the original partial-differential equations. Then, a proxy is constructed with such coefficients. Accurate statistical sensitivity indices of the uncertainty parameters can be obtained by running the proxy. We validate the approach with 2D examples by comparing with the MCS-based global SA. It is found that with only a small fraction of the computational cost required by the MCS approach, the new approach gives accurate global sensitivity for each parameter. The proposed approach is also demonstrated on a large-scale 3D black-oil model, for which the MCS-based global SA is found to be computationally infeasible. 
It is found that the developed approach possesses the following key advantages: It requires a much smaller number of reservoir simulations for accurate global SA; it is nonintrusive and can be implemented with existing codes or simulators; and it can accommodate arbitrary distributions of parameters encountered in realistic geological situations.
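The coefficient-to-sensitivity step can be sketched for a two-parameter toy model. The model, the Gaussian inputs with an orthonormal Hermite basis, and the regression-based variant of collocation below are all illustrative, not the paper's reservoir case; the point is that once the PCE coefficients are known, Sobol sensitivity indices follow analytically from sums of squared coefficients, with no further model runs.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(2)

def model(x1, x2):
    # Hypothetical stand-in for a reservoir-simulator output.
    return x1 + 0.5 * x2**2 + 0.2 * x1 * x2

# Orthonormal probabilists' Hermite polynomials for standard normal inputs.
def hermite(n, x):
    return {0: np.ones_like(x), 1: x, 2: (x**2 - 1) / np.sqrt(2)}[n]

# Multi-indices up to total degree 2 for two parameters.
multi_idx = [(i, j) for i, j in product(range(3), repeat=2) if i + j <= 2]

# Regression-based collocation: solve the model at sampled points and fit.
X = rng.normal(size=(200, 2))
y = model(X[:, 0], X[:, 1])
Psi = np.column_stack([hermite(i, X[:, 0]) * hermite(j, X[:, 1])
                       for i, j in multi_idx])
coef, *_ = np.linalg.lstsq(Psi, y, rcond=None)

# Sobol indices follow analytically from the PCE coefficients.
total_var = sum(c**2 for (i, j), c in zip(multi_idx, coef) if (i, j) != (0, 0))
S1 = sum(c**2 for (i, j), c in zip(multi_idx, coef) if i > 0 and j == 0) / total_var
S2 = sum(c**2 for (i, j), c in zip(multi_idx, coef) if j > 0 and i == 0) / total_var
print(f"S1 = {S1:.3f}, S2 = {S2:.3f}")
```

All 200 model evaluations here are cheap; in the reservoir setting each one is a simulation, which is why the small collocation budget matters.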


SPE Journal ◽  
2013 ◽  
Vol 18 (03) ◽  
pp. 495-507 ◽  
Author(s):  
D. Y. Ding ◽  
F. McKee

Summary Assisted history matching is widely used to constrain reservoir models by integrating well-production data and/or 4D seismic data. However, history matching generally requires a large number of reservoir simulations, which are very CPU time-consuming, in the optimization procedure. In this paper, we present a new technique that allows us to reduce the number of reservoir simulations for the gradient-based optimization method in history matching. The new method is based on the partial separability of the objective function with local components referred to the wells and/or the seismic zones. How to choose a good perturbation design for the gradient computation is the main issue. In this paper, the graph-coloring algorithm is applied to determine “independent” components and parameters for a partially separable function, and analytical test functions are proposed for the selection of perturbation designs. This method can significantly reduce the number of reservoir simulations for a partially separable objective function in history matching.
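The coloring idea can be sketched in a few lines (the well/parameter mapping below is hypothetical): parameters whose local components share no wells are "independent" and can be perturbed together in a single simulation, so the number of runs per finite-difference gradient equals the number of colors rather than the number of parameters.

```python
# Which wells each parameter affects (hypothetical mapping): two
# parameters conflict if they influence a common local component.
wells_of = {
    "permX_zone1": {"W1"},
    "permX_zone2": {"W2"},
    "skin_W1":     {"W1"},
    "skin_W2":     {"W2"},
    "aquifer":     {"W1", "W2"},
}

# Build conflict edges between parameters with overlapping well sets.
params = list(wells_of)
conflicts = {p: set() for p in params}
for i, p in enumerate(params):
    for q in params[i + 1:]:
        if wells_of[p] & wells_of[q]:
            conflicts[p].add(q)
            conflicts[q].add(p)

# Greedy graph coloring: each color is one group of independent
# parameters, i.e., one perturbed simulation run for the gradient.
color = {}
for p in params:
    used = {color[q] for q in conflicts[p] if q in color}
    color[p] = next(c for c in range(len(params)) if c not in used)

n_runs = max(color.values()) + 1
print(f"simulations needed: {n_runs} instead of {len(params)}")
```

Here five parameters need only three perturbation runs; with many wells and spatially localized parameters the savings grow accordingly.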


SPE Journal ◽  
2013 ◽  
Vol 18 (06) ◽  
pp. 1003-1011 ◽  
Author(s):  
Z. Bouzarkouna ◽  
D.Y. Ding ◽  
A. Auger

Summary The net present value (NPV) of a project can be significantly increased by finding the optimal location of non-conventional wells. This optimization problem is nowadays one of the most challenging problems in oil- and gas-field development. Suitable methods to tackle this problem include stochastic optimization algorithms, which are particularly robust and able to deal with complex reservoir geology with high heterogeneities. However, these methods generally require a considerable computational effort in terms of the number of reservoir simulations, which are CPU-time-demanding. This paper presents the use of the CMA-ES (covariance matrix adaptation—evolution strategy) optimizer, recognized as one of the most powerful derivative-free optimizers, to optimize well locations and trajectories. A local-regression-based metamodel is incorporated into the optimization process in order to reduce the computational cost. The objective function (e.g., the NPV) can usually be split into local components, each referring to one of the wells and in general depending on a smaller number of principal parameters, and can thus be modeled as a partially separable function. In this paper, we propose to exploit the partial separability of the objective function within CMA-ES coupled with metamodels by building partially separated metamodels. Thus, different metamodels are built for each well or set of wells, which results in more accurate modeling. An example is presented. Results show that taking advantage of the partial separability of the objective function leads to a significant decrease in the number of reservoir simulations needed to find the "optimal" well configuration, given a restricted budget of reservoir simulations. The proposed approach is practical and promising for dealing with the placement of a large number of wells.
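A minimal sketch of the partially separated metamodel idea: each well's contribution to the objective is modeled separately from that well's own coordinates, and the total is the sum of the local metamodels. The per-well NPV components and the quadratic regression below are illustrative stand-ins for the paper's local metamodels, not its method.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical per-well NPV components: each depends only on its own
# well's (x, y) location, so the total NPV is partially separable.
peaks = np.array([[0.2, 0.8], [0.7, 0.3]])      # sweet spots, two wells

def npv_well(loc, peak):
    return np.exp(-np.sum((loc - peak) ** 2, axis=-1))

def quad_features(P):                            # quadratic basis in (x, y)
    x, y = P[:, 0], P[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

# Training set: random placements of both wells.
X = rng.uniform(size=(60, 2, 2))                 # (samples, wells, xy)

# One metamodel per well, fit only to that well's local component.
models = []
for w in range(2):
    y_w = npv_well(X[:, w, :], peaks[w])
    c, *_ = np.linalg.lstsq(quad_features(X[:, w, :]), y_w, rcond=None)
    models.append(c)

def predict(P):                                  # P: (wells, xy) placement
    return sum(quad_features(P[w].reshape(1, 2))[0] @ models[w]
               for w in range(2))

# A placement at the sweet spots should outscore a remote one.
print(predict(peaks), predict(np.array([[1.0, 0.0], [0.0, 1.0]])))
```

Each local model lives in a 2D space instead of the joint 4D space, which is the accuracy advantage the abstract describes.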


1980 ◽  
Vol 20 (06) ◽  
pp. 521-532 ◽  
Author(s):  
A.T. Watson ◽  
J.H. Seinfeld ◽  
G.R. Gavalas ◽  
P.T. Woo

Abstract An automatic history-matching algorithm based on an optimal control approach has been formulated for joint estimation of spatially varying permeability and porosity and coefficients of relative permeability functions in two-phase reservoirs. The algorithm uses pressure and production rate data simultaneously. The performance of the algorithm for the waterflooding of one- and two-dimensional hypothetical reservoirs is examined, and properties associated with the parameter estimation problem are discussed. Introduction There has been considerable interest in the development of automatic history-matching algorithms. Most of the published work to date on automatic history matching has been devoted to single-phase reservoirs in which the unknown parameters to be estimated are often the reservoir porosity (or storage) and absolute permeability (or transmissibility). In the single-phase problem, the objective function usually consists of the deviations between the predicted and measured reservoir pressures at the wells. Parameter estimation, or history matching, in multiphase reservoirs is fundamentally more difficult than in single-phase reservoirs. The multiphase equations are nonlinear, and in addition to the porosity and absolute permeability, the relative permeabilities of each phase may be unknown and subject to estimation. Measurements of the relative rates of flow of oil, water, and gas at the wells also may be available for the objective function. The aspect of the reservoir history-matching problem that distinguishes it from other parameter estimation problems in science and engineering is the large dimensionality of both the system state and the unknown parameters. As a result of this large dimensionality, computational efficiency becomes a prime consideration in the implementation of an automatic history-matching method. In all parameter estimation methods, a trade-off exists between the amount of computation performed per iteration and the speed of convergence of the method. 
An important saving in computing time was realized in single-phase automatic history matching through the introduction of optimal control theory as a method for calculating the gradient of the objective function with respect to the unknown parameters. This technique currently is limited to first-order gradient methods. First-order gradient methods generally converge more slowly than those of higher order. Nevertheless, the amount of computation required per iteration is significantly less than that required for higher-order optimization methods; thus, first-order methods are attractive for automatic history matching. The optimal control algorithm for automatic history matching has been shown to produce excellent results when applied to field problems. Therefore, the first approach to the development of a general automatic history-matching algorithm for multiphase reservoirs would seem to proceed through the development of an optimal control approach for calculating the gradient of the objective function with respect to the parameters for use in a first-order method.
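The first-order iteration described above can be sketched on a toy decline model. The forward model and all numbers are hypothetical, and an analytic gradient stands in for the adjoint/optimal-control gradient; the point is the structure of a first-order method: one gradient, one update, no Hessian.

```python
import numpy as np

# Toy forward model: pressure decline p(t) = exp(-k t), with a single
# "permeability-like" parameter k to be estimated from observed data.
t = np.linspace(0.0, 1.0, 20)
k_true = 0.8
p_obs = np.exp(-k_true * t)        # observed pressure history

def objective(k):
    # Least-squares mismatch between predicted and observed pressures.
    return 0.5 * np.sum((np.exp(-k * t) - p_obs) ** 2)

def gradient(k):
    # Analytic dJ/dk; in the paper this is what the adjoint supplies
    # at the cost of roughly one extra simulation, independent of the
    # number of parameters.
    r = np.exp(-k * t) - p_obs
    return np.sum(r * (-t) * np.exp(-k * t))

k = 0.1                            # initial guess
for _ in range(1000):              # first-order (steepest-descent) updates
    k -= 0.1 * gradient(k)

print(f"estimated k = {k:.3f}")
```

Each iteration is cheap in computation but many iterations are needed, which is exactly the trade-off against higher-order methods that the abstract discusses.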


2021 ◽  
Vol 73 (04) ◽  
pp. 60-61
Author(s):  
Chris Carpenter

This article, written by JPT Technology Editor Chris Carpenter, contains highlights of paper SPE 199149, “Rate-Transient-Analysis-Assisted History Matching With a Combined Hydraulic Fracturing and Reservoir Simulator,” by Garrett Fowler, SPE, and Mark McClure, SPE, ResFrac, and Jeff Allen, Recoil Resources, prepared for the 2020 SPE Latin American and Caribbean Petroleum Engineering Conference, originally scheduled to be held in Bogota, Colombia, 17–19 March. The paper has not been peer reviewed. This paper presents a step-by-step work flow to facilitate history matching numerical simulation models of hydraulically fractured shale wells. Sensitivity analysis simulations are performed with a coupled hydraulic fracturing, geomechanics, and reservoir simulator. The results are used to develop what the authors term “motifs” that inform the history-matching process. Using intuition from these simulations, history matching can be expedited by changing matrix permeability, fracture conductivity, matrix-pressure-dependent permeability, boundary effects, and relative permeability. 
Introduction The concept of rate transient analysis (RTA) involves the use of rate and pressure trends of producing wells to estimate properties such as permeability and fracture surface area. While very useful, RTA is an analytical technique and has commensurate limitations. In the complete paper, different RTA motifs are generated using a simulator. Insights from these motif simulations are used to modify simulation parameters to expedite and inform the history-matching process. The simulation history-matching work flow presented includes the following steps:
1. Set up a simulation model with geologic properties, wellbore and completion designs, and fracturing and production schedules.
2. Run an initial model.
3. Tune the fracture geometries (height and length) to heuristic data: microseismic, frac-hit data, distributed acoustic sensing, or other diagnostics.
4. Match instantaneous shut-in pressure (ISIP) and wellhead pressure (WHP) during injection.
5. Make RTA plots of the real and simulated production data.
6. Use the motifs presented in the paper to identify possible production mechanisms in the real data.
7. Adjust history-matching parameters in the simulation model based on the intuition gained from RTA of the real data.
8. Iterate Steps 5 through 7 to obtain a match in RTA trends.
9. Modify relative permeabilities as necessary to obtain correct oil, water, and gas proportions.
In this study, the authors used a commercial simulator that fully integrates hydraulic fracturing, wellbore, and reservoir simulation into a single modeling code. 
Matching Fracturing Data The complete paper focuses on matching production data, assisted by RTA, not specifically on the matching of fracturing data such as injection pressure and fracture geometry (Steps 3 and 4). Nevertheless, for completeness, these steps are very briefly summarized in this section. Effective fracture toughness is the most-important factor in determining fracture length. Field diagnostics suggest considerable variability in effective fracture toughness and fracture length. Typical half-lengths are between 500 and 2,000 ft. Laboratory-derived values of fracture toughness yield longer fractures (propagation of 2,000 ft or more from the wellbore). Significantly larger values of fracture toughness are needed to explain the shorter fracture length and higher net pressure values that are often observed. The authors use a scale-dependent fracture-toughness parameter to increase toughness as the fracture grows. This allows the simulator to match injection pressure data while simultaneously limiting fracture length. This scale-dependent toughness scaling parameter is the most-important parameter in determining fracture size.
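One common way to parameterize such scale dependence is a power law in fracture half-length; the functional form, base toughness, reference length, and exponent below are illustrative assumptions, not values taken from the paper.

```python
# Scale-dependent fracture toughness sketch (hypothetical form/values):
# toughness grows with fracture half-length, which limits propagation
# while letting the simulator still match injection pressures.
K0 = 1500.0     # base toughness at the reference scale (illustrative)
L0 = 10.0       # reference half-length, ft (illustrative)
alpha = 0.3     # scaling exponent (illustrative)

def toughness(half_length_ft):
    # Power-law growth: larger fractures see a higher effective toughness.
    return K0 * (half_length_ft / L0) ** alpha

for L in (10, 100, 500, 2000):
    print(f"half-length {L:>5} ft -> effective toughness {toughness(L):8.0f}")
```

The qualitative behavior matches the text: toughness rises as the fracture grows, so propagation slows at the longer half-lengths even though the near-wellbore pressure response is preserved.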


2014 ◽  
Vol 11 (2) ◽  
pp. 339-350
Author(s):  
Khadidja Bouali ◽  
Fatima Kadid ◽  
Rachid Abdessemed

In this paper, a design methodology for a magnetohydrodynamic (MHD) pump is proposed. The methodology is based on direct interpretation of the design problem as an optimization problem. The simulated-annealing method is used for the optimal design of a DC MHD pump. The optimization procedure uses an objective function, here the minimization of the pump mass. The constraints are both geometric and electromagnetic in type. The obtained results are reported.
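A minimal simulated-annealing sketch for a constrained mass-minimization problem. All quantities (the two design variables, the mass expression, the constraint bounds) are hypothetical stand-ins for the pump's geometric and electromagnetic constraints, not the paper's model.

```python
import math
import random

random.seed(0)

# Toy design problem: minimize mass(x) subject to box constraints and a
# coupling constraint (standing in for an electromagnetic requirement).
def mass(x):                     # x = (channel width, magnet length), m
    return 7800.0 * x[0] * x[1] * 0.01          # kg, toy expression

def feasible(x):
    return (0.01 <= x[0] <= 0.1 and 0.05 <= x[1] <= 0.5
            and x[0] * x[1] >= 0.004)           # "performance" bound

x = (0.08, 0.4)                  # feasible starting design
best = x
T = 1.0                          # initial temperature
for _ in range(20000):
    cand = (x[0] + random.gauss(0, 0.005), x[1] + random.gauss(0, 0.02))
    if feasible(cand):
        dE = mass(cand) - mass(x)
        # Accept downhill moves always, uphill moves with prob. exp(-dE/T).
        if dE < 0 or random.random() < math.exp(-dE / T):
            x = cand
            if mass(x) < mass(best):
                best = x
    T *= 0.9997                  # geometric cooling schedule

print(f"mass = {mass(best):.2f} kg at design {best}")
```

The occasional acceptance of uphill moves at high temperature is what lets the method escape local minima before the cooling schedule freezes it near the constraint boundary.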


2018 ◽  
Author(s):  
Forlan La Rosa Almeida ◽  
Helena Nandi Formentin ◽  
Célio Maschio ◽  
Alessandra Davolio ◽  
Denis José Schiozer

2021 ◽  
Author(s):  
Carlo Cristiano Stabile ◽  
Marco Barbiero ◽  
Giorgio Fighera ◽  
Laura Dovera

Abstract Optimizing well locations for a green field is critical to mitigate development risks. Performing such workflows with reservoir simulations is very challenging due to the huge computational cost. Proxy models can instead provide accurate estimates at a fraction of the computing time. This study presents an application of new-generation functional proxies to optimize the well locations in a real oil field with respect to the actualized oil production over all the different geological realizations. Proxies are built with Universal Trace Kriging and are functional in time, allowing oil flows to be actualized over the asset lifetime. Proxies are trained on reservoir simulations using randomly sampled well locations. Two proxies are created, for a pessimistic model (P10) and a mid-case model (P50), to capture the geological uncertainties. The optimization step uses the Non-dominated Sorting Genetic Algorithm, with the discounted oil productions of the two proxies as objective functions. An adaptive approach was employed: optimized points found from a first optimization were used to retrain the proxy models, and a second run of optimization was performed. The methodology was applied on a real oil reservoir to optimize the location of four vertical production wells and compared against reference locations. 111 geological realizations were available, in which one relevant uncertainty is the presence of possible compartments. The decision space, represented by the horizontal translation vectors for each well, was sampled using Plackett-Burman and Latin-hypercube designs. A first application produced a proxy with poor predictive quality. Redrawing the areas to avoid overlaps and to confine the decision space of each well to one compartment improved the quality. This suggests that the proxy's predictive ability deteriorates in the presence of highly nonlinear responses caused by sealing faults or by wells interchanging positions. 
We then followed a two-step adaptive approach: a first optimization was performed and the resulting Pareto front was validated with reservoir simulations; to further improve the proxy quality in this region of the decision space, the validated Pareto-front points were added to the initial dataset to retrain the proxy, and the optimization was then rerun. The final well locations were validated on all 111 realizations with reservoir simulations and resulted in an overall increase of the discounted production of about 5% compared to the reference development strategy. The adaptive approach, combined with the functional proxy, proved successful in improving the workflow by purposefully enriching the training set with data points that enhance the effectiveness of the optimization step. Each optimization run relied on about 1 million proxy evaluations, which required negligible computational time. The same workflow carried out with standard reservoir simulations would have been practically unfeasible.
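The adaptive retraining loop can be sketched with a single objective and a simple Gaussian-process surrogate standing in for the Universal Trace Kriging proxy. The "simulator", the kernel, and all numbers are hypothetical; the structure is the point: train a proxy, pick proxy-optimal candidates, validate them with the expensive model, enrich the training set, and refit.

```python
import numpy as np

rng = np.random.default_rng(4)

def simulator(x):
    # Hypothetical expensive reservoir model: "discounted production"
    # peaks at well location (0.6, 0.6) in a normalized decision space.
    return -np.sum((x - 0.6) ** 2, axis=-1)

def fit_gp(X, y, ell=0.2):
    # Noise-free GP interpolation with an RBF kernel (toy proxy).
    sq = ((X[:, None] - X[None, :]) ** 2).sum(-1)
    alpha = np.linalg.solve(np.exp(-sq / (2 * ell**2))
                            + 1e-6 * np.eye(len(X)), y)
    return lambda Z: np.exp(
        -((Z[:, None] - X[None, :]) ** 2).sum(-1) / (2 * ell**2)) @ alpha

# Initial training set: randomly sampled well locations.
X = rng.uniform(size=(30, 2))
y = simulator(X)

for _ in range(3):                # adaptive retraining iterations
    proxy = fit_gp(X, y)
    cand = rng.uniform(size=(5000, 2))        # cheap proxy evaluations
    top = cand[np.argsort(proxy(cand))[-5:]]  # proxy-optimal candidates
    X = np.vstack([X, top])                   # validate with the simulator
    y = np.concatenate([y, simulator(top)])   # and enrich the training set

best = X[np.argmax(y)]
print(f"best validated location: {best}")
```

Only the handful of proxy-optimal points per round pays the simulator cost; the thousands of candidate evaluations go through the proxy, mirroring the ~1 million cheap evaluations per optimization run in the study.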


2020 ◽  
Author(s):  
G. Eremyan ◽  
I. Matveev ◽  
G. Shishaev ◽  
V. Rukavishnikov ◽  
V. Demyanov
