Efficient Design of Reservoir Simulation Studies for Development and Optimization

2007 ◽  
Vol 10 (06) ◽  
pp. 629-637 ◽  
Author(s):  
Subhash Kalla ◽  
Christopher David White

Summary Development studies examine geologic, engineering, and economic factors to formulate and optimize production plans. If there are many factors, these studies are prohibitively expensive unless simulation runs are chosen efficiently. Experimental design and response models improve study efficiency and have been widely applied in reservoir engineering. To approximate nonlinear oil and gas reservoir responses, designs must consider factors at more than two levels—not just high and low values. However, multilevel designs require many simulations, especially if many factors are being considered. Partial factorial and mixed designs are more efficient than full factorials, but multilevel partial factorial designs are difficult to formulate. Alternatively, orthogonal arrays (OAs) and nearly-orthogonal arrays (NOAs) provide the required design properties and can handle many factors. These designs span the factor space with fewer runs, can be manipulated easily, and are appropriate for computer experiments. The proposed methods were used to model a gas well with water coning. Eleven geologic factors were varied while optimizing three engineering factors. An NOA was specified with three levels for eight factors and four levels for the remaining six factors. The proposed design required 36 simulations compared to 26,873,856 runs for a full factorial design. Kriged response surfaces are compared to polynomial regression surfaces. Polynomial-response models are used to optimize completion length, tubinghead pressure, and tubing diameter for a partially penetrating well in a gas reservoir with uncertain properties. OAs, Hammersley sequences (HSs), and response models offer a flexible, efficient framework for reservoir simulation studies.

Complexity of Reservoir Studies
Reservoir studies require integration of geologic properties, drilling and production strategies, and economic parameters.
Integration is complex because parameters such as permeability, gas price, and fluid saturations are uncertain. In exploration and production decisions, alternatives such as well placement, artificial lift, and capital investment must be evaluated. Development studies examine these alternatives, as well as geologic, engineering, and economic factors to formulate and optimize production plans (Narayanan et al. 2003). Reservoir studies may require many simulations to evaluate the many factor effects on reservoir performance measures, such as net present value (NPV) and breakthrough time. Despite the exponential growth of computer memory and speed, computing accurate sensitivities and optimizing production performance is still expensive, to the point that it may not be feasible to consider all alternative models. Thus, simulation runs should be chosen as efficiently as possible. Experimental design addresses this problem statistically, and along with response models, it has been applied in engineering science (White et al. 2001; Peng and Gupta 2004; Peake et al. 2005; Sacks et al. 1989a) to:
- Minimize computational costs by choosing a small but statistically representative set of simulation runs for predicting responses (e.g., recovery)
- Decrease expected error compared with nonoptimal simulation designs (i.e., sets of sample points)
- Evaluate sensitivity of responses to varying factors
- Translate uncertainty in input factors to uncertainty in predicted performance (i.e., uncertainty analysis)
- Estimate value of information to focus resources on reducing uncertainty in factors that have the most significant effect on response uncertainty
- Help optimize engineering factors
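The Hammersley sequences mentioned in the abstract are straightforward to generate: each point combines a uniform stride in the first coordinate with radical-inverse (van der Corput) values of the index in successive prime bases for the rest. A minimal sketch, with the conventional choice of prime bases:

```python
# Hammersley point set in the unit hypercube: first coordinate is i/n, the
# remaining coordinates are radical-inverse (van der Corput) values of i in
# successive prime bases.

def radical_inverse(i, base):
    f, result = 1.0, 0.0
    while i > 0:
        f /= base
        result += f * (i % base)
        i //= base
    return result

def hammersley(n, dim, primes=(2, 3, 5, 7, 11, 13)):
    return [[i / n] + [radical_inverse(i, primes[k]) for k in range(dim - 1)]
            for i in range(n)]

print(hammersley(4, 2))  # → [[0.0, 0.0], [0.25, 0.5], [0.5, 0.25], [0.75, 0.75]]
```

Such low-discrepancy points spread evenly over the factor space, which is why they pair well with kriged or polynomial response surfaces.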

Author(s):  
David A. Romero ◽  
Cristina H. Amon ◽  
Susan Finger

To reduce the time and resources devoted to design-space exploration during simulation-based design and optimization, the use of surrogate models, or metamodels, has been proposed in the literature. Key to the success of metamodeling efforts are the experimental design techniques used to generate the combinations of input variables at which the computer experiments are conducted. Several adaptive sampling techniques have been proposed to tailor the experimental designs to the specific application at hand, using the already-acquired data to guide further exploration of the input space, instead of using a fixed sampling scheme defined a priori. Though mixed results have been reported, it has been argued that adaptive sampling techniques can be more efficient, yielding better surrogate models with fewer sampling points. In this paper, we address the problem of adaptive sampling for single- and multi-response metamodels, with a focus on Multi-stage Multi-response Bayesian Surrogate Models (MMBSM). We compare distance-optimal Latin hypercube sampling, an entropy-based criterion, and the maximum cross-validation variance criterion, originally proposed for one-dimensional output spaces and implemented in this paper for multi-dimensional output spaces. Our results indicate that, both for single- and multi-response surrogate models, the entropy-based adaptive sampling approach leads to models that are more robust to the initial experimental design and at least as accurate (or better) when compared with other sampling techniques using the same number of sampling points.
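Distance-optimal ("maximin") Latin hypercube sampling, the fixed a priori design compared above, can be sketched in a few lines: generate random Latin hypercubes and keep the one whose minimum pairwise point distance is largest. A crude illustrative version; the candidate count and seed are arbitrary choices, not taken from the paper:

```python
import itertools
import math
import random

def latin_hypercube(n, dim, rng):
    # One random LHS: each dimension is a random permutation of n strata,
    # with the point jittered uniformly inside its stratum.
    cols = []
    for _ in range(dim):
        perm = list(range(n))
        rng.shuffle(perm)
        cols.append([(p + rng.random()) / n for p in perm])
    return [tuple(col[i] for col in cols) for i in range(n)]

def maximin_lhs(n, dim, candidates=50, seed=0):
    """Crude distance-optimal LHS: best-of-`candidates` by maximin criterion."""
    rng = random.Random(seed)
    best, best_score = None, -1.0
    for _ in range(candidates):
        pts = latin_hypercube(n, dim, rng)
        score = min(math.dist(a, b) for a, b in itertools.combinations(pts, 2))
        if score > best_score:
            best, best_score = pts, score
    return best
```

Production codes use exchange or simulated-annealing optimizers rather than best-of-k random restarts, but the stratification property is the same.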


2021 ◽  
Author(s):  
Ricko Rizkiaputra ◽  
Satrio Goesmiyarso ◽  
Jufenilamora Nurak ◽  
Krishna Pratama Laya ◽  
Dimmas Ramadhan ◽  
...  

Abstract Even though downhole gauges and wellhead meters (wet-gas meters) were invented decades ago, having them installed in every well is still considered a luxury by many companies. However, is this view still reasonable for a tight gas reservoir, let alone one located in a remote area? This study describes the benefit of having both pieces of equipment for reservoir management practice in one of the biggest tight gas reservoirs in Indonesia. Generally, reservoir management is an iterative process that incorporates reservoir characterization, development planning, implementation, and monitoring. Many analyses in the reservoir management process can be performed using the above-mentioned equipment. Several have been performed here, such as: (i) interference testing and pressure-transient analysis (PTA) after a well is completed; (ii) tracking the evolution of connected volume from early production to the present day using dynamic material balance (DMB); (iii) determining flow regimes and reservoir properties using rate-transient analysis (RTA); and (iv) reservoir simulation: regular model updates and project-opportunity identification. In this study, these analyses are performed in one of the largest tight gas reservoirs in Indonesia, located in a remote area. Having complete reservoir surveillance tools such as downhole gauges and a wellhead meter on each well is beneficial for reservoir management practice. Precious subsurface data can be obtained at any time without having to wait for equipment mobilization to the location. This is critical for managing a tight gas reservoir, which usually demands robust subsurface data to reduce its uncertainties.
There are several findings based on the above-mentioned analyses: (i) the interference test indicates reservoir connectivity among the production wells; (ii) the PTA indicates that the reservoir has tight properties, although longer buildup/observation time is still needed to better understand the reservoir characteristics at a wider scale; (iii) the DMB analysis can be performed even on a daily basis to provide insight into the evolution of connected gas initially in place (GIIP) through time; in this case it still shows an increasing GIIP through time, suspected to be due to the transient flow regime of the wells; (iv) the RTA can be performed in a similar fashion; combined with the other analyses, it provides a multi-scale investigation of reservoir properties from near-wellbore to far-field, along with observation of the flow period (boundary effects) through time; in this case the reservoir properties are tight and flow is still in the transient period; and (v) the data increase the robustness of reservoir-simulation updates, since the updates are supported by many analyses; as a result, a series of hopper projects can be confidently presented to management; in this case a well-stimulation project (acid fracturing) has been performed successfully, and opportunities for further field development have been identified. This paper shows that, for a tight reservoir in a remote location, equipping each well with downhole gauges and a dedicated wellhead meter significantly increases the robustness of the reservoir management process, thus providing economic optimization for the managed asset. The capital invested at the beginning pays out quickly compared with the time and resources that would otherwise be spent mobilizing equipment to the site.
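The DMB idea rests on the classic gas material balance, p/z = (p/z)_i (1 - Gp/G): plotting p/z against cumulative production Gp gives a straight line whose Gp-intercept is the connected GIIP. A minimal sketch with hypothetical data (not from this field); growth of the fitted intercept over time is the "increasing GIIP" signature of transient flow:

```python
# Illustrative p/z material-balance sketch: fit p/z vs cumulative production (Gp)
# and extrapolate to p/z = 0 to estimate connected GIIP. Data are hypothetical.

def giip_from_pz(gp, p_over_z):
    """Least-squares line p/z = a + b*Gp; GIIP is the Gp-intercept (-a/b)."""
    n = len(gp)
    mean_x = sum(gp) / n
    mean_y = sum(p_over_z) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(gp, p_over_z)) / \
        sum((x - mean_x) ** 2 for x in gp)
    a = mean_y - b * mean_x
    return -a / b

gp = [0.0, 10.0, 20.0, 30.0]            # Bscf produced
pz = [4000.0, 3600.0, 3200.0, 2800.0]   # p/z, psia
print(giip_from_pz(gp, pz))  # → 100.0 (Bscf)
```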


2010 ◽  
Vol 13 (02) ◽  
pp. 306-312 ◽  
Author(s):  
Medhat M. Kamal ◽  
Yan Pan

Summary A new well-testing-analysis method is presented. The method allows for calculating the absolute permeability of the formation in the area influenced by the test and the average saturations in this area. Traditional pressure-transient-analysis methods have been developed and are completely adequate for single-phase flow in the reservoir. The proposed method is not intended for these conditions. The method applies to two-phase flow in the reservoir (oil and water or oil and gas). Future expansion to three-phase flow is possible. Current analysis methods yield only the effective permeability for the dominant flowing phase and the "total mobility" of all phases. The new method uses the surface-flow rates and fluid properties of the flowing phases and the same relative permeability relations used in characterizing the reservoir and predicting its future performance. The method has been verified by comparing the results from analyzing several synthetic tests that were produced by a numerical simulator with the input values. Use of the method with field data is also described. The new method could be applied wherever values of absolute permeability or fluid saturations are used in predicting well and reservoir performance. Probably, the major impact would be in reservoir simulation studies, in which the need to transform well-testing permeability to simulator input values is eliminated and additional parameters (fluid saturations) become available to help history match the reservoir performance. This work will also help in predicting well flow rates and in situations in which absolute permeability changes with time (e.g., from compaction). Results showed that the values of absolute permeability in water/oil cases could be reproduced within 3% of the correct values and within 5% of the correct values in gas/oil cases. Errors in calculating the fluid saturations were even lower.
One of the main advantages of this method is that the relative permeability curves used in calculating the absolute permeability and average saturations, and later on in numerical reservoir simulation studies, are the same, ensuring a consistent process. The proposed method does not address the question of which set of relative permeability curves should be used. This question should be answered by the engineer performing the reservoir engineering/simulation study. The proposed method mainly is meant to provide consistent results for predicting the reservoir performance using whatever relative permeability relations that are being used in the reservoir simulation model. The method does not induce any additional errors in determining the average saturation or absolute permeability over what may result from using these specific relative permeability curves in the reservoir simulation study. The impact of this study will be to expand the use of information already contained in transient data and surface flow rates of all phases. The results will provide engineers with additional parameters to improve and speed up history matching and the prediction of well and reservoir performances in just about all studies.
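The core idea admits a simplified illustration: a pressure transient yields a total mobility, surface rates yield a fractional flow, and the simulation model's relative-permeability curves then fix both the average saturation and the absolute permeability. The sketch below illustrates that idea only; it is not the authors' algorithm, and the Corey-type curves, endpoints, and viscosities are hypothetical:

```python
# Simplified sketch (not the authors' algorithm): from a total mobility obtained
# by pressure-transient analysis and a water cut measured at surface, back out
# the average water saturation and the absolute permeability, using the same
# relative-permeability curves as the simulation model. Curves are hypothetical
# Corey-type relations with Swc = 0.2 and Sor = 0.2.

def krw(sw): return 0.3 * ((sw - 0.2) / 0.6) ** 2   # water relative permeability
def kro(sw): return 0.8 * ((0.8 - sw) / 0.6) ** 2   # oil relative permeability

def solve_sw(fw_target, mu_w=0.5, mu_o=2.0, tol=1e-10):
    """Bisection on the monotonic fractional-flow curve fw(Sw)."""
    def fw(sw):
        mw, mo = krw(sw) / mu_w, kro(sw) / mu_o
        return mw / (mw + mo)
    lo, hi = 0.2 + 1e-9, 0.8 - 1e-9
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if fw(mid) < fw_target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def absolute_perm(total_mobility, sw, mu_w=0.5, mu_o=2.0):
    """k such that k * (krw/mu_w + kro/mu_o) matches the PTA total mobility."""
    return total_mobility / (krw(sw) / mu_w + kro(sw) / mu_o)
```

Because the same curves feed the simulator, the recovered k and saturation are consistent with the history-matching workflow, which is the consistency benefit described above.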


1994 ◽  
Vol 31 (4) ◽  
pp. 545-557 ◽  
Author(s):  
Warren F. Kuhfeld ◽  
Randall D. Tobias ◽  
Mark Garratt

The authors suggest the use of D-efficient experimental designs for conjoint and discrete-choice studies, discussing orthogonal arrays, nonorthogonal designs, relative efficiency, and nonorthogonal design algorithms. They construct designs for a choice study with asymmetry and interactions and for a conjoint study with blocks and aggregate interactions.
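One common form of the D-efficiency measure for an orthogonally coded design matrix X with n runs and p parameters is 100·|X'X/n|^(1/p), which equals 100 for a balanced orthogonal design. A small self-contained sketch; the 2^3 factorial at the end is just an illustration:

```python
# D-efficiency of a coded design matrix X (n runs x p parameters):
#   D-efficiency = 100 * |X'X / n|^(1/p)
# Pure-Python linear algebra, adequate for small design matrices.

def det(m):
    """Determinant by Gaussian elimination with partial pivoting."""
    m = [row[:] for row in m]
    n, d = len(m), 1.0
    for i in range(n):
        pivot = max(range(i, n), key=lambda r: abs(m[r][i]))
        if abs(m[pivot][i]) < 1e-12:
            return 0.0
        if pivot != i:
            m[i], m[pivot] = m[pivot], m[i]
            d = -d
        d *= m[i][i]
        for r in range(i + 1, n):
            f = m[r][i] / m[i][i]
            m[r] = [a - f * b for a, b in zip(m[r], m[i])]
    return d

def d_efficiency(X):
    n, p = len(X), len(X[0])
    xtx = [[sum(X[r][i] * X[r][j] for r in range(n)) / n for j in range(p)]
           for i in range(p)]
    return 100.0 * det(xtx) ** (1.0 / p)

# 2^3 full factorial (intercept plus three +/-1 coded factors) is orthogonal:
X = [[1, a, b, c] for a in (-1, 1) for b in (-1, 1) for c in (-1, 1)]
print(d_efficiency(X))  # → 100.0
```

Nonorthogonal designs score below 100, which gives a single number for comparing candidate conjoint or choice designs.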


2009 ◽  
Author(s):  
Ilya D. Mishev ◽  
Bret Loneil Beckner ◽  
Serge A. Terekhov ◽  
Nelli Fedorova

2013 ◽  
Vol 295-298 ◽  
pp. 3213-3218
Author(s):  
Ping Yue ◽  
Xiao Fan Chen ◽  
Fu Li ◽  
Xiao Ju Zhuang

Pressure-maintenance measures such as gas cycling, in which produced lean gas is re-injected into the reservoir, are common and effective practices for improving the development performance of gas-condensate reservoirs. However, effective control of gas channeling and of the water cut of horizontal wells in a gas reservoir with an aquifer are both key to improving development effectiveness. Many factors are responsible for gas channeling and water cresting, and single-factor experiments make it difficult to obtain reliable results. Orthogonal experimental design, also called Taguchi experimental design, is widely used in many fields of scientific research. This paper applied Taguchi experimental design: an L9 orthogonal array was used with the X gas reservoir numerical-simulation model to carry out a sensitivity analysis of four factors: well pattern, gas-cycle time, gas offtake, and recirculation ratio.
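The L9(3^4) array is the standard Taguchi table: nine runs for up to four three-level factors, with every pair of columns containing each level combination exactly once. A sketch that verifies this property and maps coded levels to factor values; the specific factor values below are hypothetical, not those of the X reservoir study:

```python
# The standard Taguchi L9(3^4) orthogonal array (levels coded 0-2): nine runs
# covering four three-level factors such that every pair of columns contains
# each of the nine ordered level pairs exactly once.
L9 = [
    [0, 0, 0, 0],
    [0, 1, 1, 1],
    [0, 2, 2, 2],
    [1, 0, 1, 2],
    [1, 1, 2, 0],
    [1, 2, 0, 1],
    [2, 0, 2, 1],
    [2, 1, 0, 2],
    [2, 2, 1, 0],
]

def is_orthogonal(array):
    """Pairwise balance: each ordered level pair appears once per column pair."""
    from itertools import combinations
    for i, j in combinations(range(len(array[0])), 2):
        if len({(row[i], row[j]) for row in array}) != len(array):
            return False
    return True

# Map coded levels to (hypothetical) factor values to build the simulation runs.
factors = {
    "well_pattern":  ["five-spot", "line-drive", "irregular"],
    "cycle_time_yr": [1, 3, 5],
    "offtake_frac":  [0.02, 0.04, 0.06],
    "recirc_ratio":  [0.5, 0.7, 0.9],
}
names = list(factors)
runs = [{name: factors[name][lvl] for name, lvl in zip(names, row)} for row in L9]
```

Nine runs instead of the 3^4 = 81 of a full factorial is the efficiency gain the sensitivity study relies on.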


2009 ◽  
Author(s):  
Gael Gibert ◽  
Francois Michel Gouth ◽  
Rashed Noman ◽  
Abdulla Ahmad Al-Suwaidi
