A New Reservoir Simulation System for a Better Reservoir Management

Author(s):  
Didier Lefevre ◽  
Gerard Pellissier ◽  
J-C. Sabathier
2001 ◽  
Author(s):  
B.L. Beckner ◽  
J.M. Hutfilz ◽  
M.B. Ray ◽  
J.F. Tomich

2021 ◽  
Author(s):  
Ricko Rizkiaputra ◽  
Satrio Goesmiyarso ◽  
Jufenilamora Nurak ◽  
Krishna Pratama Laya ◽  
Dimmas Ramadhan ◽  
...  

Abstract Even though downhole gauges and wellhead (wet gas) meters were invented decades ago, installing them in every well is still considered a luxury by many companies. However, is this view still reasonable for a tight gas reservoir, let alone one located in a remote area? This study describes the benefit of having both pieces of equipment for reservoir management practice in one of the biggest tight gas reservoirs in Indonesia. Generally, reservoir management is an iterative process that incorporates reservoir characterization, development planning, implementation, and monitoring. Many analyses in the reservoir management process can be performed using the above-mentioned equipment. Several have been carried out here, such as: (i) an interference test and Pressure Transient Analysis (PTA) after each well is completed; (ii) the evolution of connected volume from early production to the present day using Dynamic Material Balance (DMB); (iii) flow regime and reservoir properties using Rate Transient Analysis (RTA); and (iv) reservoir simulation, covering regular model updates and project opportunity identification. In this study, these analyses are performed on one of the largest tight gas reservoirs in Indonesia, located in a remote area. Having a complete set of reservoir surveillance tools, such as downhole gauges and a wellhead meter on each well, is beneficial for reservoir management practice. Valuable subsurface data can be obtained at any time without waiting for equipment mobilization to the location. This is critical for managing a tight gas reservoir, which usually demands robust subsurface data to reduce its uncertainties. There are several findings based on these analyses: (i) the interference test indicates reservoir connectivity among the production wells; (ii) the PTA indicates that the reservoir has tight properties, although longer buildup/observation times are still needed to better understand the reservoir characteristics at a wider scale; (iii) the DMB analysis can be performed even on a daily basis to provide insight into the evolution of connected gas initially in place (GIIP) through time; in this case it still shows an increasing GIIP, suspected to be due to the transient flow regime of the wells; (iv) the RTA can be performed in similar fashion and, when combined with the other analyses, provides a multi-scale investigation of reservoir properties from near wellbore to far field, together with flow-period (boundary) observation through time; in this case the reservoir properties are tight and flow is still in the transient period; (v) the analyses increase the robustness of reservoir simulation updates, so a series of project opportunities (hopper) can be confidently presented to management; in this case a well stimulation (acid fracturing) project has been performed successfully, and opportunities for further field development have been identified. This paper shows that, for a tight reservoir in a remote location, equipping each well with downhole gauges and a dedicated wellhead meter significantly increases the robustness of the reservoir management process, thus providing economic optimization for the managed asset. The capital invested at the outset pays out quickly when compared with the time and resources that would otherwise be spent mobilizing equipment to site.
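The Dynamic Material Balance idea described above can be sketched in a few lines. The example below uses entirely hypothetical pressure, z-factor, and cumulative-production values and the simplified p/z form of the material balance for a volumetric dry-gas tank; it is not the operator's actual workflow, only an illustration of how the connected GIIP estimate can be refreshed as surveillance data accumulate.

```python
# Minimal Dynamic Material Balance sketch (hypothetical data, simplified physics):
# for a volumetric dry-gas tank, p/z declines linearly with cumulative production Gp,
# and the x-intercept of that line is the connected gas initially in place (GIIP).
import numpy as np

# Hypothetical surveillance data: average reservoir pressure (psia) derived from
# downhole-gauge data, matching z-factors, and cumulative production (Bscf).
p = np.array([4800.0, 4650.0, 4515.0, 4390.0, 4270.0])   # psia
z = np.array([0.952, 0.946, 0.941, 0.937, 0.933])         # gas deviation factor
Gp = np.array([0.0, 12.0, 23.5, 34.0, 44.0])               # Bscf

p_over_z = p / z

# Straight-line fit p/z = a*Gp + b; the connected GIIP is where p/z extrapolates to zero.
a, b = np.polyfit(Gp, p_over_z, 1)
giip = -b / a

print(f"slope = {a:.2f} psia/Bscf, intercept = {b:.0f} psia")
print(f"apparent connected GIIP ~ {giip:.0f} Bscf")

# Repeating this fit as new data arrive shows how the apparent connected volume
# evolves; a GIIP that keeps growing with time is consistent with the abstract's
# observation that the wells are still in transient flow and have not yet "seen"
# the full connected volume.
```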


1998 ◽  
Vol 1 (01) ◽  
pp. 5-11 ◽  
Author(s):  
N.G. Saleri

Summary Managing complexity and technological complexification is a necessity in today's business environment. This paper outlines a method to increase value addition significantly through multidisciplinary reservoir studies. In this context, value addition refers to a positive impact on a business decision. The approach ensures a level of complexification in line both with the business questions at hand and the realities of reservoirs. Sparse well control, seismic uncertainties, imperfect geologic models, time constraints, software viruses, and computing hardware limitations represent some common reservoir realities. The process model detailed in the paper uses these apparent shortcomings to moderate (i.e., guide) the level of complexification. Several project examples illustrate the implementation of the process model. The paper is an extension of three previous investigations1–3 that deal with issues of method and uncertainty in reservoir-performance forecasting.

Introduction
Multidisciplinary teams and data have become the standard 1990s methods to address large-scale reservoir-management issues. Concurrently, reservoir simulation has assumed the role of a "knowledge manager" of ever-growing quantities of information. The paper pursues three basic questions: How can we maximize the value added from integrated reservoir studies? How can we achieve a pragmatic balance between business objectives/timetables and problem complexification? And how best can we use the technology dividend provided by the explosion of computing power? Primarily because of their size, Saudi Arabian fields amplify the significance of these three questions. What has emerged is the realization that reservoir simulation needs to provide a proper demarcation between scientific and business objectives to remain business-relevant. The discussion that follows consists of two main parts. First, we present an analysis of complexity in general and reservoir systems in particular. This is followed by a process model (i.e., parallel planning plus) and a set of principles that link business needs, reservoir realities, and simulation in the context of multidisciplinary studies. The following definitions will facilitate the discussion. Complex (adjective): composed of interconnected parts. Complexity: the state of being intricate; the degree of interconnection among various parts. Complexification: the process of adding incremental levels of complexity to a system.

Detail vs. Dynamic Complexity
A vast array of multisourced information makes up reservoir systems (Fig. 1). Reservoir simulation is our attempt to link the "detail complexity" of such a system to the "dynamic complexity"4,5 expected in business decisions. In this regard, a systems engineering perspective on reservoir management is very relevant. Senge4 defines two types of complexity: detail and dynamic. Detail complexity entails defining individual ingredients in fine detail, while dynamic complexity refers to the dynamic, often unpredictable, outcomes of the interactions of the individual components. Senge4 states that "the real leverage in most management situations lies in understanding dynamic complexity, not detail complexity." This is precisely true for many of the questions facing reservoir-management project teams in the industry. When to initiate an EOR project or pattern realignment, or how to develop a field, are typical dynamic complexity problems. Relative-permeability data, field-management strategies, or wellbore hydraulics are examples of detail complexity. Geologic, geostatistical, and reservoir-simulation models are also examples of detail complexity, but represent higher orders of organization. Interestingly, reservoir-simulation models have a dual function: first, as an organizer of detail complexity, and, second, as a tool for interpreting dynamic complexity (a distinction from geologic models). Technological complexification is the process of adding incremental levels of detail complexity to a system to represent its dynamic complexity more rigorously. Each one of the components depicted in Fig. 1 offers an avenue of complexification. Perhaps ironically, every component also carries an element of uncertainty. New technologies are adding significantly to the detail complexity available to multidisciplinary teams. One can see that advances in computing technology, for instance, play a role in the cycle of complexification that Fig. 2 shows. As we acquire more computing power, we can build more complex models, which further delineate the questions being addressed, calling for more computing power, and so on. The real question, however, is whether we are in fact getting a better answer to the questions posed. Or, alternatively, are we making a difference? Multidisciplinary studies are vulnerable to the tendency towards maximal detail complexity. As one of the constituent disciplines (e.g., seismic, geostatistics) produces a more detailed reservoir representation, the pressure mounts for the other disciplines to match that level of complexification in their respective areas. However, for many reservoir problems, we may have a nonlinear relationship between dynamic and detail complexity (Fig. 3). As the number of detail complexity elements rises, the number of interactions among the elements proliferates. Any one of these interactions can be a showstopper. For example, reservoir-simulation models constructed at the detail level (i.e., scale) of geocellular models can become numerically unstable or prohibitively central-processing-unit (CPU) intensive - either way, a nonsolution.

Complexification vs. Error Expectations
The reservoir system depicted in Fig. 1 does not represent a controlled data environment; i.e., we are not operating in a setting where we can control the quality and quantity (sufficiency) of data. Therefore, in reservoir systems, the concept of "garbage in/garbage out," when taken literally, is an oxymoron. There is always some contamination (error or uncertainty) in one of the detail complexity elements. Thus, we need to redefine our mission as "given the data environment as is, what is an acceptable error, and what is an appropriate level of complexification?"


1986 ◽  
Vol 26 (1) ◽  
pp. 397
Author(s):  
A.B. Kaliszewski

The Hutton reservoir in the Merrimelia Field (Cooper-Eromanga Basin) was the subject of a 3-D reservoir simulation study. The primary objective of the study was to develop a reservoir management tool for evaluating the performance of the field under various depletion options. The study confirmed that the ultimate oil recovery from this strong water drive reservoir was not adversely affected by increasing the total fluid offtake rate. However, any decisions regarding changes to the depletion scheme, such as increasing production rates, if based solely on computer simulation results, should be viewed with caution. Careful monitoring of any changes to the depletion philosophy and checking of actual data against simulation predictions are essential to ensure that oil production rate and ultimate recovery are optimised. The model assisted in evaluating the economics of development drilling. While the simulation results are dependent on the validity of geological mapping, the model was useful in confirming that, due to very high transmissibility in the Hutton reservoir, additional wells would only accelerate production rather than increase ultimate recovery. The issue of drilling wells thus became one of balancing the benefits of accelerating production against the geological risk associated with each well. Interaction between the reservoir engineer and various disciplines, particularly development geology, is critical in the development and application of a good working simulation model. This was found to be especially important during the history matching phase of the study. If engineers and development geologists can learn more of each other's discipline and appreciate the role that each has to play in simulation studies, the validity of such models can only be improved. The paper addresses a number of the pitfalls commonly encountered in the application of reservoir simulation results.
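As a minimal illustration of the checking of actual data against simulation predictions advocated above, the sketch below compares observed and forecast cumulative oil per well against a tolerance. The well names, volumes, and 10% threshold are all hypothetical, chosen only to show the kind of routine check implied by the paper.

```python
# Minimal surveillance check (hypothetical data and tolerance): compare observed
# cumulative oil against the simulation forecast and flag wells that drift beyond
# a chosen tolerance, prompting a review of the depletion scheme or a model update.

TOLERANCE = 0.10  # flag deviations larger than 10% (illustrative threshold)

# Hypothetical cumulative oil (Mstb) at the latest review date.
observed = {"MER-1": 812.0, "MER-2": 655.0, "MER-3": 540.0}
simulated = {"MER-1": 790.0, "MER-2": 730.0, "MER-3": 525.0}

for well, obs in observed.items():
    sim = simulated[well]
    deviation = (obs - sim) / sim
    status = "REVIEW" if abs(deviation) > TOLERANCE else "ok"
    print(f"{well}: observed {obs:.0f} Mstb vs simulated {sim:.0f} Mstb "
          f"({deviation:+.1%}) -> {status}")
```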


2001 ◽  
Vol 4 (02) ◽  
pp. 114-120 ◽  
Author(s):  
V.J. Zapata ◽  
W.M. Brummett ◽  
M.E. Osborne ◽  
D.J. Van Nispen

Summary One of the most perplexing and difficult challenges in the industry is deciding how to develop a new oil or gas field. It is necessary to estimate recoverable reserves, design the most efficient exploitation strategy, decide where and when to drill wells and install surface facilities, and predict the rate of production. This requires a clear understanding of energy distribution and fluid movements throughout the entire system, under any given operational scenario or market-demand situation. Even after a reservoir-development plan is selected, there are many possible facility designs, each with different investment and operating costs. An important, but not always considered, fact is that each facility scheme could result in different future production rates owing to various types, sizes, and configurations of fluid-flow facilities. Selecting the best design for the asset requires the most accurate production forecasts possible over the forecast life cycle. No other single technology has the ability to provide this insight as well as tightly coupled reservoir and facility simulation, because it combines all pertinent geological and engineering data into a single, comprehensive, dynamic model of the entire oilfield flow system. An integrated oilfield simulation system accounts for all dynamic flow effects and provides a test environment for quickly and accurately comparing alternative designs. This paper provides a brief background of this technology and gives a review of a major development project where it is currently being applied. Finally, we describe some recent significant advances in the technology that make it more stable, accurate, and rigorous.

Introduction
Finite-difference reservoir simulation is widely used to predict production performance of oil and gas fields. This is usually done in a "stand-alone" mode, where individual well performance is commonly calculated from pregenerated multiphase wellbore flow tables that cover various ranges of wellhead and bottomhole pressures, gas/oil ratios (GOR's), and water/oil ratios (WOR's). The reservoir simulator determines the predicted production rate from these tables, normally assuming a fixed wellhead pressure and using a flowing bottomhole pressure calculated by the reservoir simulator. With this scheme it is not possible to consider the changing flow-resistance effects of the piping system as various fluids merge or split in the surface network. Neglecting this interaction of the surface network can, in many cases, introduce substantial errors into predicted performance. Basing multimillion- (in some cases, billion-) dollar exploitation designs on performance predictions that are suboptimal can be very detrimental to the asset's long-range profitability. To help eliminate this problem, considerable attention is being given to coupling reservoir simulators and multiphase facility network simulators to improve the accuracy of forecasting.

Landscape
Surface-network simulation technology was first introduced in 1976.1 Although successfully applied in selected cases, the concept was not widely adopted because of the excessive additional computing demands on computers of that era. In those earlier applications, the time consumed by the facility calculations could actually exceed the reservoir calculations.2,3 As computer performance has increased by orders of magnitude, this has become less of an issue. Reservoir model sizes have increased dramatically with much finer grids that take advantage of the increased computer power, but there was no need for a corresponding increase in the size of the facility models. Today, with tightly coupled reservoir/wellbore/surface models, the facility calculations are a fairly small part of the overall computing time, and there is considerable effort in the industry to build these types of systems.4,5 Chevron's current tightly coupled oilfield simulation system is CHEARS®/PIPESOFT-2™. CHEARS® is a fully implicit 3D reservoir simulator with black-oil, compositional, thermal, miscible, and polymer formulations. It has fully implicit dual-porosity and dual-permeability options, and unlimited multiple-level local grid refinement. PIPESOFT-2™ is a comprehensive multiphase wellbore/surface-network simulator. It has black-oil, compositional, CO2, steam, and non-Newtonian fluid capabilities. It can solve any type of complex nested looping, both surface and subsurface. The coupling is done at the wellbore completion interval, which is the natural domain boundary between the flow systems. We refer to our implementation as "tightly coupled" because information is dynamically exchanged directly between the simulators without any intermediate intervention. A simple representation of the interaction between the simulators is shown in Fig. 1.

Gorgon Field Example
The following is an example of how this technology is currently being used. The Gorgon field is a Triassic gas accumulation estimated to contain over 20 Tscf of gas, located 130 km offshore northwest Australia in 300 m of water (Fig. 2). It is currently undergoing development studies for an LNG project.

Field and Model Description. The field is 45 km long and 9 km wide, and it comprises more than 2000 m of Triassic fluvial Mungaroo formation in angular discordance with a Jurassic-age unconformity. It has been subdivided into 11 vertical intervals (or zones) on the basis of regional sequence boundaries and depositional systems. These 11 zones were first modeled individually with an object-based modeling technique before being stacked into a 715-layer full-field geologic model. This model was subsequently scaled up to a 46-layer reservoir simulation model, reducing the size of the model from 4.5 million cells to 290,000 cells. While the scaleup process preserved the original 11 zone boundaries, the majority of the layers were located in regions identified as key flow units. In addition to vertical subdivision, seismic and appraisal well data suggest structural compartmentalization, resulting in six major fault blocks. After deactivating appropriate cells, the final simulation model contained 50,000 active cells and was initialized with 35 independent pressure regions. Each of these regions corresponds to a single zone in a single fault block.
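The tight coupling described above can be caricatured as a per-timestep balancing loop between the reservoir's inflow behaviour and the wellbore/surface network's outflow behaviour. The sketch below is purely conceptual: the function names, the linear inflow and outflow relations, and all numbers are hypothetical stand-ins, not the CHEARS®/PIPESOFT-2™ interfaces; the point is only the exchange of rate and bottomhole pressure at the completion interval until the two solutions agree.

```python
# Conceptual sketch of a tightly coupled reservoir/surface-network timestep
# (hypothetical interfaces; not the CHEARS/PIPESOFT-2 API). The two simulators
# exchange rate and bottomhole pressure at the completion interval until the
# inflow (reservoir) and outflow (wellbore/network) solutions agree.

def reservoir_inflow(q_gas):
    """Hypothetical reservoir response: flowing BHP (psia) for a given rate (MMscf/d)."""
    p_res, productivity = 4200.0, 0.08          # illustrative tank pressure and PI
    return p_res - q_gas / productivity

def network_outflow(q_gas):
    """Hypothetical network response: BHP required to deliver the rate to the plant."""
    p_sep, resistance = 1200.0, 18.0            # separator pressure and friction term
    return p_sep + resistance * q_gas

def couple_timestep(q_guess=10.0, tol=1.0, max_iter=50):
    """Iterate the rate until the inflow and outflow BHPs match (fixed-point style)."""
    q = q_guess
    for _ in range(max_iter):
        bhp_in = reservoir_inflow(q)
        bhp_out = network_outflow(q)
        if abs(bhp_in - bhp_out) < tol:
            return q, bhp_in
        # Nudge the rate toward balance: an inflow pressure above the network's
        # requirement means the system can carry more gas, so increase the rate.
        q += 0.01 * (bhp_in - bhp_out)
    return q, bhp_in

rate, bhp = couple_timestep()
print(f"converged rate ~ {rate:.1f} MMscf/d at BHP ~ {bhp:.0f} psia")
```

In a full coupled system this exchange happens for every completion, and the network solve also handles streams merging and splitting at the surface; the sketch only conveys the iteration-to-agreement idea behind the "tightly coupled" label.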


2000 ◽  
Author(s):  
A. Bakulin ◽  
N. Drinkwater ◽  
C. Signer ◽  
S. Ryan ◽  
A. O'Dovovan
