Full-Field Modeling Using Streamline-Based Simulation: Four Case Studies

2002 ◽  
Vol 5 (02) ◽  
pp. 126-134 ◽  
Author(s):  
R.O. Baker ◽  
F. Kuppe ◽  
S. Chugh ◽  
R. Bora ◽  
S. Stojanovic ◽  
...  

Summary Modern streamline-based reservoir simulators are able to account for actual field conditions such as 3D multiphase flow effects, reservoir heterogeneity, gravity, and changing well conditions. A streamline simulator was used to model four field cases, with approximately 400 wells and 150,000 gridblocks. History-match run times were approximately 1 CPU hour per run, with the final history matches completed in approximately 1 month per field. In all field cases, a high percentage of wells were history matched within the first two to three runs. Streamline simulation not only enables a rapid turnaround time for studies, but also serves as a distinct tool for resolving each studied field's unique characteristics. The primary reasons for faster history matching of permeability fields using 3D streamline technology, as compared to conventional finite-difference (FD) techniques, are as follows:
- Streamlines clearly identify which producer-injector pairs communicate strongly (flow visualization).
- Streamlines allow the use of a very large number of wells, thereby substantially reducing the uncertainty associated with outer-boundary conditions.
- Streamline flow paths indicate that idealized drainage patterns do not exist in real fields; it is therefore unrealistic to extract symmetric elements out of a full field.
- The speed and efficiency of the method allow the solution of fine-scale and/or full-field models with hundreds of wells.
- The streamline simulator honors the historical total fluid injection and production volumes exactly, because there are no drawdown constraints for incompressible problems.
- The technology allows easy identification of regions that require modification to achieve a history match.
- Streamlines provide new flow information (i.e., well connectivity, drainage volumes, and well allocation factors) that cannot be derived from conventional simulation methods.
Introduction
In the past, streamline-based flow simulation was quite limited in its application to field data. Emanuel and Milliken1 showed how hybrid streamtube models were used to rapidly history match field data, arriving at both an updated geologic model and a current oil-saturation distribution for input to FD simulations. FD simulators were then used in forecast mode. Recent advances in streamline-based flow simulators have overcome many of the limitations of previous streamline and streamtube methods.2-6 Streamline-based simulators are now fully 3D and account for multiphase gravity and fluid mobility effects as well as compressibility effects. Another key improvement is that the simulator can now account for changing well conditions due to rate changes, infill drilling, producer-injector conversions, and well abandonments. With these advances, the technique is rapidly becoming a common tool to assist in the modeling and forecasting of field cases. As the technology has matured, it has become available to a larger group of engineers and is no longer confined to research centers. Published case studies using streamline simulators now appear from a broad range of sources.7-12 Because of the increasing interest in this technology, our first intent in this paper is to outline a methodology for where and how streamline-based simulation fits in the reservoir engineering toolbox. Our second objective is to provide insight into why we think the method works so well in some cases. Finally, we demonstrate the application of the technology to everyday field situations useful to mainstream exploitation or reservoir engineers, as opposed to specialized or research applications.

The Streamline Simulation Method
For a more detailed mathematical description of the streamline method, please refer to the Appendix and subsequent references.
In brief, the streamline simulation method solves a 3D problem by decoupling it into a series of 1D problems, each one solved along a streamline. Unlike FD simulation, streamline simulation transports fluids along a dynamically changing streamline-based flow grid, as opposed to the underlying Cartesian grid. The result is that large timestep sizes can be taken without numerical instabilities, giving the streamline method near-linear scaling of CPU time vs. model size.6 For very large models, streamline-based simulators can be one to two orders of magnitude faster than FD methods. The timestep size in streamline methods is not limited by a classic grid-throughput (CFL) condition but by how far fluids can be transported along the current streamline grid before the streamlines need to be updated. Factors that influence this limit include nonlinear effects such as mobility, gravity, and well-rate changes.5 In real field displacements, historical well effects have a far greater impact on streamline-pattern changes than do mobility and gravity. Thus, the key is determining how much historical data can be upscaled without significantly impacting simulation results. For all cases considered here, 1-year timestep sizes were more than adequate to capture changes in historical data, gravity, and mobility effects. It is worth noting that upscaling historical data would also benefit run times for FD simulations. Where possible, both streamline (SL) and FD methods would then require similar simulation times. However, only for very coarse grids and specific problems is it possible to take 1-year timestep sizes with FD methods. As the grid becomes finer, CFL limitations begin to dictate the timestep size, which is much smaller than is necessary to honor nonlinearities. This is why streamline methods exhibit larger speed-up factors over FD methods as the number of grid cells increases.
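The 1D solves at the heart of the method can be sketched in a few lines. This is a minimal illustration only: explicit first-order upwind transport of water saturation along a single streamline in time-of-flight coordinates, with a made-up quadratic fractional-flow model, grid, and timestep; it is not the scheme of any particular simulator.

```python
def frac_flow(sw, mobility_ratio=2.0):
    """Water fractional flow for simple quadratic relative permeabilities."""
    krw, kro = sw ** 2, (1.0 - sw) ** 2
    return krw / (krw + kro / mobility_ratio)

def transport_1d(sw, dtau, dt, n_steps):
    """Advance water saturation along one streamline (injector at index 0)
    with an explicit first-order upwind scheme in time-of-flight units."""
    for _ in range(n_steps):
        f = [frac_flow(s) for s in sw]
        new = sw[:]
        for i in range(1, len(sw)):  # flux always comes from the upstream node
            new[i] = sw[i] - (dt / dtau) * (f[i] - f[i - 1])
        sw = new
    return sw

# water injected at the inlet displaces oil along the streamline
profile = transport_1d([1.0] + [0.1] * 49, dtau=1.0, dt=0.2, n_steps=100)
```

Solving many such independent 1D problems, then mapping saturations back to the 3D grid and retracing streamlines, is what allows the large timesteps described above.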

2009 ◽  
Vol 180 (5) ◽  
pp. 387-397 ◽  
Author(s):  
Catherine Ponsot-Jacquin ◽  
Frédéric Roggero ◽  
Guillaume Enchery

Abstract The facies-proportion calibration method is a new history-matching technique that modifies facies proportions within a fine geological/geostatistical model until a good match of the field data is reached. The initial facies proportions in the model are usually constrained locally by well data, for example, but their spatial tendencies may be unreliable in some parts of the reservoir. The algorithm presented in this paper introduces average proportion ratios between facies groups in order to calculate new facies proportions while taking their initial values into account. It can be applied locally on specific regions or globally on the whole reservoir, for stationary or non-stationary facies distributions. The proportion ratios can be adjusted manually or computed iteratively through an optimization process. The method has been successfully applied to a real field case.
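As a rough illustration of the ratio idea (a simplified stand-in, not the authors' exact algorithm), new facies proportions in a cell can be obtained by scaling one facies group by a ratio and renormalizing, so the result still reflects the initial values and sums to one. The facies names and ratio value here are hypothetical.

```python
def adjust_proportions(props, group, ratio):
    """props: dict facies -> proportion summing to 1; facies in `group`
    have their share multiplied by `ratio` before renormalization."""
    scaled = {f: p * (ratio if f in group else 1.0) for f, p in props.items()}
    total = sum(scaled.values())
    return {f: p / total for f, p in scaled.items()}

cell = {"channel": 0.2, "levee": 0.3, "shale": 0.5}
updated = adjust_proportions(cell, group={"channel"}, ratio=2.0)
# the channel share rises while all proportions still sum to one
```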


Fuels ◽  
2021 ◽  
Vol 2 (3) ◽  
pp. 286-303
Author(s):  
Vuong Van Pham ◽  
Ebrahim Fathi ◽  
Fatemeh Belyadi

The success of machine learning (ML) techniques implemented in different industries relies heavily on operator expertise and domain knowledge, which is used to manually choose an algorithm and set its parameters for a given problem. Due to the manual nature of model selection and parameter tuning, it is impossible to quantify or evaluate the quality of this process, which in turn limits the ability to perform comparison studies between different algorithms. In this study, we propose a new hybrid approach for developing machine learning workflows that supports automated algorithm selection and hyperparameter optimization. The proposed approach provides a robust, reproducible, and unbiased workflow that can be quantified and validated using different scoring metrics. We compared the most common workflows implemented in engineering applications of artificial intelligence (AI) and ML, including grid/random search, Bayesian search and optimization, and genetic programming, with our new hybrid approach, which integrates the Tree-based Pipeline Optimization Tool (TPOT) with Bayesian optimization. The performance of each workflow is quantified using scoring metrics such as the correlation coefficient (R2) and the mean square error (MSE). For this purpose, actual field data from 1567 gas wells in the Marcellus Shale, with 121 features spanning reservoir, drilling, completion, stimulation, and operation, are tested using the different proposed workflows. The new hybrid workflow is then used to evaluate the type well used for evaluation of Marcellus Shale gas production. In conclusion, our automated hybrid approach showed significant improvement over the other workflows on both scoring metrics.
The new hybrid approach provides a practical tool that supports automated model and hyperparameter selection, tested using real field data, and can be applied to different engineering problems using artificial intelligence and machine learning. The new hybrid model is tested on a real field and compared with conventional type wells developed by field engineers. The type well of the field is found to be very close to the P50 predictions of the field, which indicates great success in the field's completion design by the field engineers. It also shows that the field's average production could have been improved by 8% if shorter cluster spacing and higher proppant loading per cluster had been used during the frac jobs.
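The two-stage idea behind the hybrid workflow (broad automated search followed by local refinement) can be caricatured in a few lines. This toy sketch fits a one-parameter ridge-style model on synthetic data: a random search stands in for TPOT's pipeline exploration, and a local sweep around the incumbent stands in for Bayesian optimization; none of the data, model, or search ranges come from the paper.

```python
import random

xs = [i / 10 for i in range(50)]
ys = [2.0 * x + ((-1) ** i) * 0.1 for i, x in enumerate(xs)]  # slope 2 + noise

def mse_for(lam):
    """One-parameter ridge-style fit y = a*x; returns the mean squared error."""
    a = sum(x * y for x, y in zip(xs, ys)) / (sum(x * x for x in xs) + lam)
    return sum((y - a * x) ** 2 for x, y in zip(xs, ys)) / len(xs)

random.seed(0)
# stage 1: broad random search over the regularization strength (exploration)
coarse = min((10 ** random.uniform(-4, 2) for _ in range(20)), key=mse_for)
# stage 2: local refinement around the incumbent (exploitation)
best = min((coarse * f for f in (0.25, 0.5, 1.0, 2.0, 4.0)), key=mse_for)
```

The refinement stage can only match or improve on the coarse result, which is the property the hybrid workflow relies on.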


2021 ◽  
Vol 11 (11) ◽  
pp. 5025
Author(s):  
David González-Peña ◽  
Ignacio García-Ruiz ◽  
Montserrat Díez-Mediavilla ◽  
Mª. Isabel Dieste-Velasco ◽  
Cristina Alonso-Tristán

Prediction of energy production is crucial for the design and installation of PV plants. In this study, five free and commercial software tools for predicting photovoltaic energy production are evaluated: RETScreen, Solar Advisor Model (SAM), PVGIS, PVSyst, and PV*SOL. The evaluation compares monthly and annually predicted data on energy supplied to the national grid with real field data collected from three real PV plants. The systems, all located in Castile and Leon (Spain), use three different mounting systems: fixed mounting, horizontal-axis tracking, and dual-axis tracking. Operating data from 2008 to 2020 are used in the evaluation. Although the commercial software tools were easier to use and allowed the installations to be described in detail, their results were not appreciably superior. In annual global terms, the results hid poor estimations throughout the year, where overestimations were compensated by underestimations. This fact was reflected in the monthly results: the software yielded overestimates during the colder months, while showing better estimates during the warmer months. In most cases, the deviation was below 10% when the annual results were analyzed. The accuracy of the software was also reduced when complex dual-axis solar tracking replaced the fixed installation.
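The compensation effect described above is easy to see numerically: monthly errors that individually exceed 10% can largely cancel in the annual total. The monthly figures below are invented for illustration only.

```python
measured  = [90, 95, 110, 130, 150, 165, 170, 160, 140, 115, 95, 85]   # kWh
predicted = [105, 110, 120, 135, 148, 158, 160, 152, 138, 118, 103, 95]

# winter months are overestimated, summer months underestimated
monthly_dev = [abs(p - m) / m * 100 for p, m in zip(predicted, measured)]
annual_dev = abs(sum(predicted) - sum(measured)) / sum(measured) * 100
```

Here the worst monthly deviation is above 15% while the annual deviation is under 3%, mirroring how annual aggregates can hide poor monthly estimates.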


2021 ◽  
Vol 255 ◽  
pp. 106620
Author(s):  
A. Elouneg ◽  
D. Sutula ◽  
J. Chambert ◽  
A. Lejeune ◽  
S.P.A. Bordas ◽  
...  

2021 ◽  
Author(s):  
Ali Al-Turki ◽  
Obai Alnajjar ◽  
Majdi Baddourah ◽  
Babatunde Moriwawon

Abstract The algorithms and workflows have been developed to couple efficient model parameterization with stochastic, global optimization using a Multi-Objective Genetic Algorithm (MOGA) for global history matching, coupled with an advanced workflow for streamline sensitivity-based inversion for fine-tuning. During parameterization, the low-rank subsets of the most influential reservoir parameters are identified and propagated to the MOGA to perform the field-level history match. Data misfits between the historical field data and simulation data are calculated with multiple realizations of reservoir models that quantify and capture reservoir uncertainty. Each generation of the optimization algorithm reduces the data misfit relative to the previous iteration. This iterative process continues until a satisfactory field-level history match is reached or no further improvement occurs. The fine-tuning process of well-connectivity calibration is then performed with a streamline sensitivity-based inversion algorithm to locally update the model and reduce well-level mismatch. In this study, an application of the proposed algorithms and workflow is demonstrated for model calibration and history matching. The synthetic reservoir model used in this study is discretized into millions of grid cells with hundreds of producer and injector wells. It is designed to generate several decades of production and injection history to evaluate and demonstrate the workflow. In field-level history matching, reservoir rock properties (e.g., permeability, fault transmissibility, etc.) are parameterized to conduct the global match of pressure and production rates. The Grid Connectivity Transform (GCT) was used and assessed to parameterize the reservoir properties. In addition, the convergence rate and history-match quality of the MOGA were assessed during the field (global) history matching.
Also, the effectiveness of the streamline-based inversion was evaluated by quantifying the additional improvement in history-match quality per well. The developed parameterization and optimization algorithms and workflows revealed the unique features of each algorithm for model calibration and history matching. This integrated workflow has successfully defined and carried uncertainty throughout the history-matching process. Following the successful field-level history match, well-level history matching was conducted using streamline sensitivity-based inversion, which further improved the history-match quality and conditioned the model to historical production and injection data. In general, the workflow results in enhanced history-match quality in a shorter turnaround time. The geological realism of the model is retained for robust prediction and development planning.
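A toy version of the genetic-algorithm loop used for the global match (single-objective here for brevity, with a made-up misfit standing in for a reservoir simulator) looks like this: each individual is a vector of region permeability multipliers, fitness is a data misfit, and each generation keeps the best candidates and mutates them. Dimensions, rates, and the misfit model are illustrative, not the paper's field model.

```python
import random

TRUE = [1.5, 0.7, 1.2]  # hidden multipliers a perfect match would recover

def misfit(ind):
    """Stand-in objective: squared difference from the 'observed' response."""
    return sum((a - b) ** 2 for a, b in zip(ind, TRUE))

random.seed(1)
pop = [[random.uniform(0.1, 3.0) for _ in TRUE] for _ in range(30)]
initial_misfit = misfit(min(pop, key=misfit))
for _ in range(40):                       # generations
    pop.sort(key=misfit)
    parents = pop[:10]                    # selection: keep the best third
    pop = parents + [
        [g + random.gauss(0.0, 0.05) for g in random.choice(parents)]
        for _ in range(20)                # mutation-only offspring
    ]
best = min(pop, key=misfit)               # misfit shrinks across generations
```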


Author(s):  
Amitabh Kumar ◽  
Brian McShane ◽  
Mark McQueen

A large oil and gas pipeline gathering system is commonly used to transport processed oil and gas from an offshore platform to an onshore receiving facility. High reliability and integrity for continuous operation of these systems are crucial to ensure a constant supply of hydrocarbons to the onshore processing facility and eventually to market. When such a system is exposed to a series of complex environmental loadings, it is often difficult to predict the response path and in-situ condition, and therefore the system's ability to withstand subsequent loading scenarios. In order to continue operating the pipeline after a significant environmental event, an overall approach needs to be developed to: (a) understand the system loading and the associated integrity; (b) develop a series of criteria staging the sequence of actions following an event that will verify the pipeline integrity; and (c) ensure that the integrity management solution is simple and easy to understand so that it can be implemented consistently. For a complex loading scenario, one of the main challenges is predicting the controlling parameter(s) that drive the global integrity of these systems. In such scenarios, the presence of numerous parameters makes the technical modeling and prediction tasks arduous. To address such scenarios, it is first crucial to understand the baseline environmental data and other critical design inputs. If the "design environmental baseline" has transformed (due to large events such as storms) from its original condition, it modifies the dynamics of the system. To address this problem, thorough modeling and assessment of the in-situ condition is essential. Further, a robust calibration method is required to predict the future response path and therefore the expected pipeline condition. The study further compares the planned integrity management solutions to the field data to validate the efficiency of the predicted scenarios.
By including real field-data feedback in the modeling method, balanced integrity solutions can be achieved, and the ability to quantify risks becomes more practical and actionable.


1988 ◽  
Vol 120 (S146) ◽  
pp. 57-70 ◽  
Author(s):  
Vincent G. Nealis

Abstract The effects of weather on the spruce budworm parasitoid, Apanteles fumiferanae Vier., are examined. A phenological model based on temperature-dependent rates of development and longevity is developed and validated with field data. The model is then used to explore the effects of age-specific mortality on phenological behaviour of the parasitoid and the seasonal synchrony between the parasitoid and its host over several years. The results show that the parasitoid adult ecloses well before the host reaches an age susceptible to parasitism but that the egg maturation period and the longevity of the parasitoid diminish the consequences of the apparent asynchrony. The historical data reveal that the relative phenological characteristics of A. fumiferanae and its host vary little from year to year. In the second part of the study, temperature is shown to have a strong effect on adult parasitoid activity and on the rate of oviposition.
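A minimal rate-summation sketch of this kind of temperature-dependent phenology model (the threshold and rate constants are hypothetical, not Nealis's fitted values): daily development accrues in proportion to degrees above a threshold, and the stage completes when the accumulated fraction reaches 1.0.

```python
BASE_T = 6.0             # developmental threshold, deg C (hypothetical)
RATE_PER_DEGREE = 0.004  # fraction of the stage completed per degree-day

def eclosion_day(daily_mean_temps):
    """Return the first day accumulated development reaches 1.0, else None."""
    dev = 0.0
    for day, t in enumerate(daily_mean_temps, start=1):
        dev += max(0.0, t - BASE_T) * RATE_PER_DEGREE
        if dev >= 1.0:
            return day
    return None

# a warm spring advances eclosion relative to a cool one
warm = eclosion_day([16.0] * 60)   # accrues 0.04 per day
cool = eclosion_day([11.0] * 60)   # accrues 0.02 per day
```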

