An Integrated Ensemble-Based Uncertainty Centric Approach to Address Multi-Disciplinary Reservoir Challenges While Accelerating Subsurface Modeling Process in an Onshore Field, Abu Dhabi, UAE

2021 ◽  
Author(s):  
Salahaldeen Alqallabi ◽  
Abdul Saboor Khan ◽  
Anish Phade ◽  
Mohamed Tarik Gacem ◽  
Mustapha Adli ◽  
...  

Abstract The aim of this study is to demonstrate the value of a fully integrated ensemble-based modeling approach for an onshore field in Abu Dhabi. Model uncertainties are included in both static and dynamic domains, and valuable insights are achieved in a record time of nine weeks with very promising results. Workflows are established to honor the recommended static and dynamic modeling processes suited to the complexity of the field. Realistic sedimentological, structural, and dynamic reservoir parameter uncertainties are identified and propagated to obtain realistic variability in the reservoir simulator response. These integrated workflows are used to generate an ensemble of equiprobable reservoir models. All realizations in the ensemble are then history-matched simultaneously before production predictions are carried out using the entire ensemble. Analysis of the updates made during the history-matching process reveals valuable insights into the reservoir, such as the presence of enhanced-permeability streaks. These represent a challenge for explicit modeling because of their complex responses on well-log profiles. However, analysis of the history-matched ensemble shows that the locations of the high-permeability updates generated by the history-matching process are consistent with geological observations of enhanced-permeability streaks in cores and with the sequence-stratigraphic framework. Additionally, post-processing of available PLT data as a blind test shows that trends of fluid flow along horizontal wells are well captured, increasing confidence in the geologic consistency of the ensemble of models. This modeling approach provides an ensemble of history-matched reservoir models with an excellent match to observed production data at both the field and individual-well level. Furthermore, with the recommended modeling workflows, the generated models are geologically consistent and honor inherent correlations in the input data. 
Forecasting with this ensemble of models enables realistic uncertainties in dynamic responses to be quantified, providing insights for informed reservoir management decisions and risk mitigation. Analysis of the forecasted ensemble dynamic responses helps evaluate the performance of existing infill targets and delineate new infill targets while understanding the associated risks under both static and dynamic uncertainty. Repeatable workflows allow new data to be incorporated in a robust manner and accelerate the time from model building to decision making.
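The quantification of forecast uncertainty from an ensemble reduces, in its simplest form, to computing percentiles across the realizations. A minimal sketch (not the study's software; the function name and the P10/P50/P90 exceedance convention are assumptions):

```python
import numpy as np

def forecast_percentiles(ensemble_cum_production):
    """P10/P50/P90 of a forecast quantity across an ensemble of models.
    Follows the common petroleum convention: P10 is the high case
    (10% probability of exceedance), P90 the low case.
    ensemble_cum_production: 1D array, one value per realization."""
    p90, p50, p10 = np.percentile(ensemble_cum_production, [10, 50, 90])
    return {"P90": p90, "P50": p50, "P10": p10}
```

With an ensemble of history-matched models, the same call applied to any simulated response (rates, cumulatives, water cut) gives the uncertainty band used for infill-target risking.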

2021 ◽  
Author(s):  
Ali Al-Turki ◽  
Obai Alnajjar ◽  
Majdi Baddourah ◽  
Babatunde Moriwawon

Abstract The algorithms and workflows have been developed to couple efficient model parameterization with stochastic, global optimization using a Multi-Objective Genetic Algorithm (MOGA) for global history matching, coupled with an advanced workflow for streamline sensitivity-based inversion for fine-tuning. During parameterization, the low-rank subsets of the most influential reservoir parameters are identified and propagated to the MOGA to perform the field-level history match. Data misfits between the field historical data and simulation data are calculated with multiple realizations of reservoir models that quantify and capture reservoir uncertainty. Each generation of the optimization algorithm reduces the data misfit relative to the previous iteration. This iterative process continues until a satisfactory field-level history match is reached or no further improvement occurs. The fine-tuning process of well-connectivity calibration is then performed with a streamline sensitivity-based inversion algorithm to locally update the model and reduce well-level mismatch. In this study, an application of the proposed algorithms and workflow is demonstrated for model calibration and history matching. The synthetic reservoir model used in this study is discretized into millions of grid cells with hundreds of producer and injector wells. It is designed to generate several decades of production and injection history to evaluate and demonstrate the workflow. In field-level history matching, reservoir rock properties (e.g., permeability, fault transmissibility, etc.) are parameterized to conduct the global match of pressure and production rates. The Grid Connectivity Transform (GCT) was used and assessed to parameterize the reservoir properties. In addition, the convergence rate and history-match quality of the MOGA were assessed during the field-level (global) history matching. 
Also, the effectiveness of the streamline-based inversion was evaluated by quantifying the additional improvement in history-match quality per well. The developed parameterization and optimization algorithms and workflows revealed the unique features of each algorithm for model calibration and history matching. This integrated workflow has successfully defined and carried uncertainty throughout the history-matching process. Following the successful field-level history match, well-level history matching was conducted using streamline sensitivity-based inversion, which further improved the history-match quality and conditioned the model to historical production and injection data. In general, the workflow results in enhanced history-match quality in a shorter turnaround time. The geological realism of the model is retained for robust prediction and development planning.
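The selection core of a multi-objective GA can be illustrated with a per-objective data misfit and a non-dominated (Pareto) filter. A minimal sketch under assumed names, not the authors' implementation:

```python
import numpy as np

def data_misfit(sim, obs, sigma):
    """Normalized sum-of-squares mismatch between simulated and observed series."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return float(np.mean(((sim - obs) / sigma) ** 2))

def pareto_front(objectives):
    """Indices of non-dominated misfit vectors (minimization).
    A point is dominated if another point is no worse in every objective
    and strictly better in at least one; MOGA ranks candidates this way."""
    obj = np.asarray(objectives, float)
    front = []
    for i in range(len(obj)):
        dominated = any(
            np.all(obj[j] <= obj[i]) and np.any(obj[j] < obj[i])
            for j in range(len(obj)) if j != i)
        if not dominated:
            front.append(i)
    return front
```

Each realization contributes one misfit vector (e.g., pressure misfit, rate misfit); the surviving front seeds the next generation.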


2015 ◽  
Vol 138 (1) ◽  
Author(s):  
Jihoon Park ◽  
Jeongwoo Jin ◽  
Jonggeun Choe

For decision making, it is crucial to have proper reservoir characterization and uncertainty assessment of reservoir performance. Since initial models constructed with limited data have high uncertainty, it is essential to integrate both static and dynamic data for reliable future predictions. Uncertainty quantification is computationally demanding because a single history matching requires many iterative forward simulations and optimizations, and multiple realizations of reservoir models must be computed. In this paper, a methodology is proposed to rapidly quantify uncertainties by combining streamline-based inversion and distance-based clustering. The distance between two reservoir models is defined as the norm of the difference of their generalized travel time (GTT) vectors. Reservoir models are then grouped according to these distances, and representative models are selected from each group. Inversions are performed on the representative models instead of on all models. We use generalized travel time inversion (GTTI) for the integration of dynamic data to overcome high nonlinearity and to take advantage of its computational efficiency. It is verified that the proposed method gathers models with similar dynamic responses and similar permeability distributions. It also assesses the uncertainty of reservoir performance reliably while reducing the amount of computation significantly by using the representative models.
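The distance definition above translates directly into code; the clustering below is a simple farthest-point heuristic standing in for whatever distance-based clustering the authors use, so treat it as an illustrative sketch only:

```python
import numpy as np

def gtt_distances(gtt_vectors):
    """Pairwise distances: norm of the difference of GTT vectors per model pair."""
    g = np.asarray(gtt_vectors, float)
    return np.linalg.norm(g[:, None, :] - g[None, :, :], axis=2)

def representatives(dist, k):
    """Pick k representative models; assign every model to its nearest one.
    Greedy: start from the most central model, then repeatedly add the model
    farthest from the current representatives (farthest-point heuristic)."""
    reps = [int(np.argmin(dist.sum(axis=1)))]       # most central model first
    while len(reps) < k:
        reps.append(int(np.argmax(dist[:, reps].min(axis=1))))
    labels = np.argmin(dist[:, reps], axis=1)       # cluster membership
    return reps, labels
```

Inversion is then run only on the `reps` models, one per cluster, instead of on every realization.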


2021 ◽  
Author(s):  
Abdul Saboor Khan ◽  
Salahaldeen Alqallabi ◽  
Anish Phade ◽  
Arne Skorstad ◽  
Faisal Al-Jenaibi ◽  
...  

Abstract The aim of this study is to demonstrate the value of an integrated ensemble-based modeling approach for multiple reservoirs of varying complexity. Three carbonate reservoirs with different challenges are selected to showcase the flexibility of the approach to subsurface teams. Modeling uncertainties are included in both static and dynamic domains, and valuable insights are attained in a short reservoir modeling cycle time. Integrated workflows are established with guidance from multi-disciplinary teams to incorporate recommended static and dynamic modeling processes in parallel and overcome the modeling challenges of the individual reservoirs. Challenges such as zonal communication, presence of baffles, high-permeability streaks, communication with neighboring fields, water saturation modeling uncertainties, relative permeability with hysteresis, fluid-contact depth shift, etc., are considered when accounting for uncertainties. All the uncertainties in sedimentology, structure, and dynamic reservoir parameters are set through common dialogue and collaboration between subsurface teams to ensure that modeling best practices are adhered to. Adaptive pluri-Gaussian simulation is used for facies modeling, and uncertainties are propagated into the dynamic response of the geologically plausible ensembles. These equiprobable models are then history-matched simultaneously using an ensemble-based conditioning tool to match the available observed field production data within a specified tolerance, with the reservoirs differing in number of wells, number of grid cells, and length of production history. This approach results in a significantly reduced modeling cycle time compared to the traditional approach, regardless of the inherent complexity of the reservoir, while giving better history-matched models that honor the geology and the correlations in the input data. 
These models are built with only the level of detail required by the modeling objectives, leaving more time to extract insights from the ensemble of models. Uncertainties in data from the various domains are not treated in isolation but are propagated throughout, as they may play an important role in another domain or in the total response uncertainty. Similarly, the approach encourages a collaborative effort in reservoir modeling and fosters trust between geoscientists and engineers, ensuring that models remain consistent across all subsurface domains. It allows the flexibility to incorporate modeling practices fit for individual reservoirs. Moreover, analysis of the history-matched ensemble yields added insights into the reservoirs, such as the location and possible extent of features like high-permeability streaks and baffles that are not explicitly modeled at the outset. Forecast strategies, subsequently run on these ensembles of equiprobable models, capture realistic uncertainties in dynamic responses, which can help make informed reservoir management decisions. The integrated ensemble-based modeling approach is successfully applied to three reservoir cases with different levels of complexity. The fast-tracked process from model building to decision making enabled rapid insights for all domains involved.
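The core idea of pluri-Gaussian facies simulation is a truncation rule applied to two (usually correlated) Gaussian fields; the adaptive variant in the study refines this, but the basic mechanism can be sketched as follows (thresholds and the three-facies rule are illustrative assumptions):

```python
import numpy as np

def plurigaussian_facies(g1, g2, t1=0.0, t2=0.0):
    """Truncation rule mapping two Gaussian fields to three facies codes.
    Facies 1 where g1 >= t1; among the rest, facies 2 where g2 >= t2,
    else facies 0. Real truncation diagrams encode geological ordering
    (e.g., which facies may touch which)."""
    g1, g2 = np.asarray(g1), np.asarray(g2)
    facies = np.zeros(g1.shape, dtype=int)
    facies[g1 >= t1] = 1
    facies[(g1 < t1) & (g2 >= t2)] = 2
    return facies
```

Uncertainty in facies proportions is carried by varying the thresholds per realization, which is one way sedimentological uncertainty propagates into the ensemble.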


SPE Journal ◽  
2006 ◽  
Vol 11 (04) ◽  
pp. 431-442 ◽  
Author(s):  
Xian-Huan Wen ◽  
Wen H. Chen

Summary The ensemble Kalman filter (EnKF) technique has been reported to be very efficient for real-time updating of reservoir models to match the most current production data. Using the EnKF, an ensemble of reservoir models assimilating the most current observations of production data is always available. Thus, the estimates of reservoir model parameters and their associated uncertainty, as well as the forecasts, are always up to date. In this paper, we apply the EnKF to continuously update an ensemble of permeability models to match real-time multiphase production data. We improve the previous EnKF by adding a confirming option (i.e., the flow equations are re-solved from the previous assimilation step to the current step using the updated current permeability models). By doing so, we ensure that the updated static and dynamic parameters are always consistent with the flow equations at the current step. However, it also creates some inconsistency between the static and dynamic parameters at the previous step where the confirming starts. Nevertheless, we show that, with the confirming approach, the filter performs better for the particular example investigated. We also investigate the sensitivity to the number of realizations used in the EnKF. Our results show that a relatively large number of realizations is needed to obtain stable results, particularly for reliable assessment of uncertainty. The sensitivity to different covariance functions is also investigated. The efficiency and robustness of the EnKF are demonstrated using an example. By assimilating more production data, new features of heterogeneity in the reservoir model can be revealed with reduced uncertainty, resulting in more accurate predictions of reservoir production. Introduction The reliability of reservoir models increases as more data are included in their construction. 
Traditionally, static (hard and soft) data, such as geological, geophysical, and well-log/core data, are incorporated into reservoir geological models through conditional geostatistical simulation (Deutsch and Journel 1998). Dynamic production data, such as historical measurements of reservoir production, account for the majority of reservoir data collected during the production phase. These data are directly related to the recovery process and to the response variables that form the basis for reservoir management decisions. Incorporation of dynamic data is typically done through a history-matching process. Traditionally, history matching adjusts model variables (such as permeability, porosity, and transmissibility) so that the flow simulation results using the adjusted parameters match the observations. It usually requires repeated flow simulations. Both manual and (semi-)automatic history-matching processes are available in the industry (Chen et al. 1974; He et al. 1996; Landa and Horne 1997; Milliken and Emanuel 1998; Vasco et al. 1998; Wen et al. 1998a, 1998b; Roggero and Hu 1998; Agarwal and Blunt 2003; Caers 2003; Cheng et al. 2004). Automatic history matching is usually formulated as a minimization problem in which the mismatch between measurements and computed values is minimized (Tarantola 1987; Sun 1994). Gradient-based methods are widely employed for such minimization problems and require the computation of sensitivity coefficients (Li et al. 2003; Wen et al. 2003; Gao and Reynolds 2006). In the past decade, automatic history matching has been a very active research area with significant progress reported (Cheng et al. 2004; Gao and Reynolds 2006; Wen et al. 1997). However, most approaches are either limited to small, simple reservoir models or are computationally too intensive for practical applications. 
Under the framework of traditional history matching, uncertainty is usually assessed by repeating the history-matching process with different initial models, which makes the process even more CPU-demanding. In addition, traditional history-matching methods are not designed to allow continuous model updating. When new production data become available and must be incorporated, the history-matching process has to be repeated using all measured data. These drawbacks limit the efficiency and applicability of traditional automatic history-matching techniques.
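A standard EnKF analysis step with perturbed observations, the general scheme this paper builds on, can be written compactly; this is a minimal sketch of the textbook update, not the authors' confirming variant:

```python
import numpy as np

def enkf_update(X, Hx, d_obs, R, rng):
    """One EnKF analysis step with perturbed observations.
    X:  (n_state, n_ens) ensemble of parameters/states (e.g., log-permeability),
    Hx: (n_obs, n_ens) simulated data per member,
    d_obs: (n_obs,) observed data, R: (n_obs, n_obs) observation-error covariance."""
    n_ens = X.shape[1]
    # perturb observations so the posterior ensemble spread is statistically correct
    D = d_obs[:, None] + rng.multivariate_normal(np.zeros(len(d_obs)), R, n_ens).T
    Xp = X - X.mean(axis=1, keepdims=True)           # state anomalies
    Hp = Hx - Hx.mean(axis=1, keepdims=True)         # predicted-data anomalies
    Cxd = Xp @ Hp.T / (n_ens - 1)                    # state-data covariance
    Cdd = Hp @ Hp.T / (n_ens - 1) + R                # innovation covariance
    return X + Cxd @ np.linalg.solve(Cdd, D - Hx)    # Kalman-gain update
```

The confirming option described in the abstract would, after this update, re-solve the flow equations from the previous assimilation step with the updated permeabilities before moving on.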


2021 ◽  
Author(s):  
Usman Aslam ◽  
Jorge Burgos ◽  
Craig Williams ◽  
Shawn McCloskey ◽  
James Cooper ◽  
...  

Abstract Reservoir production forecasts are inherently uncertain due to the limited quality data available to build predictive reservoir models. Multiple data types, including historical production, well tests (RFT/PLT), and time-lapse seismic data, are assimilated into reservoir models during the history-matching process to improve the predictability of the model. Traditionally, a ‘best estimate’ of relative permeability is assumed during the history-matching process, despite there being significant uncertainty in the relative permeability. Relative permeability governs multiphase flow in the reservoir; it is therefore of significant importance for understanding reservoir behavior, for model calibration, and hence for reliable production forecasts. Performing sensitivities around the ‘best estimate’ relative permeability case covers only part of the uncertainty space, with no indication of the confidence that may be placed on the resulting forecasts. In this paper, we present an application of a Bayesian framework for uncertainty assessment and efficient history matching of a Permian CO2 EOR field for reliable production forecasting. The study field has complex geology with over 65 years of historical data from primary recovery, waterflood, and CO2 injection. Relative permeability data from the field showed significant uncertainty, so we treated the saturation endpoints as well as the curvature of the relative permeability in multiple zones as uncertain, employing generalized Corey functions to parameterize the relative permeability. Uncertainty in the relative permeability is propagated through a common platform integrator. An automated workflow generates the first set of relative permeability curves, sampled from the prior distribution of saturation endpoints and Corey exponents, called ‘scoping runs’. These relative permeability curves are then passed to the reservoir simulator. 
The assumptions about uncertainties in the relative permeability data and other dynamic parameters are quickly validated by comparing the scoping runs with historical observations. By constructing a mismatch (likelihood) function, the Bayesian framework generates an ensemble of history-matched models calibrated to the production data, which can then be used for reliable probabilistic forecasting. Several iterations of manual history matching did not yield an acceptable solution because uncertainty in the relative permeability was ignored. An application of Bayesian inference accelerated by a proxy model found the relative permeability data to be one of the most influential parameters during the assisted-history-matching exercise. Incorporating the uncertainty in relative permeability data along with other dynamic parameters not only helped speed up the model calibration process but also led to the identification of multiple history-matched models. In addition, results show that the use of the Bayesian framework significantly reduced uncertainty in the most important dynamic parameters. The proposed approach allows previously ignored uncertainty in the relative permeability data to be incorporated in a systematic manner. The user-defined mismatch function increases the likelihood of obtaining an acceptable match, and its weights allow both the measurement uncertainty and the effect of simulation-model inaccuracies to be accounted for. The Bayesian framework considers the whole uncertainty space, not just the history-match region, leading to the identification of multiple history-matched models.
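Generalized Corey parameterization with priors on endpoints and exponents, as used for the scoping runs, can be sketched as follows; the prior ranges here are invented for illustration and are not the field's:

```python
import numpy as np

def corey_krw_kro(sw, swc, sorw, krw_max, kro_max, nw, no):
    """Generalized Corey water/oil relative-permeability curves.
    swc: connate water saturation, sorw: residual oil to water,
    nw/no: Corey exponents controlling the curvature."""
    s = np.clip((sw - swc) / (1.0 - swc - sorw), 0.0, 1.0)  # normalized saturation
    return krw_max * s ** nw, kro_max * (1.0 - s) ** no

def scoping_curves(n_runs, rng):
    """Sample curves from uniform priors (ranges are illustrative only)."""
    sw = np.linspace(0.0, 1.0, 101)
    runs = []
    for _ in range(n_runs):
        p = dict(swc=rng.uniform(0.10, 0.25), sorw=rng.uniform(0.15, 0.35),
                 krw_max=rng.uniform(0.3, 0.8), kro_max=rng.uniform(0.7, 1.0),
                 nw=rng.uniform(1.5, 4.0), no=rng.uniform(1.5, 4.0))
        runs.append((p, corey_krw_kro(sw, **p)))
    return runs
```

Each sampled parameter set becomes one scoping run; comparing its simulated response against history then feeds the mismatch function.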


Author(s):  
Beth Lyall-Wilson ◽  
Nicolas Kim ◽  
Elizabeth Hohman

This paper describes the development and new application of a text modeling process for identifying human factors topics, such as fatigue, workload, and distraction, in aviation safety reports. Current approaches to identifying human factors topic representations in text data rely on manual review by subject matter experts. The implementation of a semi-supervised text modeling method removes the need for lengthy manual review through an initial extraction of pre-defined human factors topics, freeing time to focus on analyzing the information. This modeling approach allows analysts to define topics of interest up front using keywords and to influence the convergence of the model toward a result that reflects them, which provides an advantage over classic topic modeling approaches in which domain knowledge is not integrated into the generation of derived topics. This paper includes a description of the modeling approach and rationale, the data used, evaluation methods, challenges, and suggestions for future applications.
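The abstract does not specify the model, but the keyword-seeding idea it describes can be illustrated minimally; the topic names and seed words below are invented examples, and a real semi-supervised topic model uses seeds only to initialize and bias inference, not as the final classifier:

```python
from collections import Counter

# Analyst-defined seed keywords (illustrative; real seeds come from SMEs).
SEED_TOPICS = {
    "fatigue": {"fatigued", "tired", "rest", "sleep"},
    "workload": {"workload", "busy", "saturated", "tasks"},
    "distraction": {"distracted", "interruption", "attention"},
}

def seed_topic_scores(report_text):
    """Score a safety report against each seeded topic by keyword counts,
    the simplest form of letting analyst-defined topics drive the model."""
    counts = Counter(report_text.lower().split())
    return {t: sum(counts[w] for w in kws) for t, kws in SEED_TOPICS.items()}
```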


SPE Journal ◽  
2017 ◽  
Vol 22 (05) ◽  
pp. 1506-1518 ◽  
Author(s):  
Pedram Mahzari ◽  
Mehran Sohrabi

Summary Three-phase flow in porous media during water-alternating-gas (WAG) injection and the associated cycle-dependent hysteresis have been the subject of experimental and theoretical studies. Despite attempts to develop models and simulation methods for WAG injection and three-phase flow, the current lack of a solid approach for handling hysteresis effects in simulating WAG-injection scenarios has led to misinterpretation of simulation outcomes at laboratory and field scales. In this work, using our improved methodology, the first cycle of the WAG experiments (the first waterflood and the subsequent gasflood) was history matched to estimate the two-phase relative permeabilities (krs) for oil/water and gas/oil. For subsequent cycles, the pertinent parameters of the WAG hysteresis model are included in the automatic-history-matching process to reproduce all WAG cycles together. The results indicate that history matching the whole WAG experiment leads to a significantly improved simulation outcome, which highlights the importance of two elements in evaluating WAG experiments: inclusion of the full WAG experiment in history matching, and use of a more representative set of two-phase krs, derived from our new methodology for estimating two-phase krs from the first cycle of a WAG experiment. Because WAG-related parameters should be able to model any three-phase flow irrespective of the WAG scenario, in another exercise the tuned parameters obtained from a WAG experiment starting with water were used in a similar coreflood test (WAG starting with gas) to assess their predictive capability for simulating three-phase flow in porous media. After identifying the shortcomings of existing models, the improved methodology was used to history match multiple coreflood experiments simultaneously to estimate parameters that can reasonably capture the processes taking place in WAG under different scenarios, that is, starting with water or with gas. 
The comprehensive simulation study performed here sheds light on a consolidated methodology for estimating saturation functions that can simulate WAG injection under different scenarios.
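WAG hysteresis models (e.g., Larsen and Skauge) typically build on Land's (1968) trapping relation, which links the trapped gas saturation after imbibition to the gas saturation reached during drainage. A minimal sketch of that building block, not the authors' full hysteresis model:

```python
def land_trapped_gas(sgi, c_land):
    """Land (1968) trapping relation: trapped gas saturation Sgt after
    imbibition, given the initial (flowing) gas saturation Sgi reached
    during drainage and the Land coefficient C.
    Sgt = Sgi / (1 + C * Sgi); larger C means less gas is trapped."""
    return sgi / (1.0 + c_land * sgi)
```

In a cycle-dependent WAG model, this trapped saturation shifts the scanning curves of the gas relative permeability between drainage and imbibition cycles.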


2005 ◽  
Author(s):  
Paul Thomas ◽  
Mickaele Le Ravalec-Dupin ◽  
Frederic Roggero

2021 ◽  
Author(s):  
Victor de Souza Rios ◽  
Arne Skauge ◽  
Ken Sorbie ◽  
Gang Wang ◽  
Denis José Schiozer ◽  
...  

Abstract Compositional reservoir simulation is essential to represent the complex interactions associated with gas flooding processes. Generally, an improved description of such small-scale phenomena requires very detailed reservoir models, which increases the computational cost. We provide a practical and general upscaling procedure to guide a robust selection of upscaling approaches considering the nature and limitations of each reservoir model, exploring the differences between the upscaling of immiscible and miscible gas injection problems. We highlight the different challenges in achieving improved upscaled models for immiscible and miscible gas displacement conditions with a stepwise workflow. We first identify the need for a special permeability upscaling technique to improve the representation of the main reservoir heterogeneities and sub-grid features smoothed during the upscaling process. Then, we verify whether pseudo-functions are necessary to correct the multiphase-flow dynamic behavior. At this stage, different pseudoization approaches are recommended according to the miscibility conditions of the problem. This study evaluates highly heterogeneous reservoir models subjected to immiscible and miscible gas flooding. The fine models represent a small part of a reservoir with a highly refined set of grid-block cells of 5 × 5 cm² area. The upscaled coarse models have grid-block cells of 8 × 10 m² area, which is compatible with a refined geological model in reservoir engineering studies. This results in a challenging upscaling ratio of 32 000. We show a consistent procedure to achieve reliable results with the coarse-scale model under the different miscibility conditions. For immiscible displacement, accurate results can be obtained with the coarse models after a proper permeability upscaling procedure and the use of pseudo-relative-permeability curves to improve the dynamic responses. 
Miscible displacements, however, require specific treatment of the fluid modeling process to overcome the limitations arising from the thermodynamic equilibrium assumption. In all situations, the workflow leads to a robust choice of techniques that satisfactorily improve the coarse-scale simulation results. Our approach works on two fronts. (1) We apply a dual-porosity/dual-permeability upscaling process, developed by Rios et al. (2020a), to enable the representation of sub-grid heterogeneities in the coarse-scale model, providing consistent improvements in the upscaling results. (2) We generate specific pseudo-functions according to the miscibility conditions of the gas flooding process. We developed a stepwise procedure to deal with the upscaling problems consistently and to enable a better understanding of the coarsening process.
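The quoted ratio of 32 000 is simple areal arithmetic, (8 m × 10 m) / (5 cm × 5 cm); alongside it, the arithmetic and harmonic averages give the classical upper and lower bounds that any single-phase permeability upscaling must respect. A sketch of both, not the paper's dual-porosity/dual-permeability scheme:

```python
import numpy as np

# Areal upscaling ratio quoted in the study: (8 m x 10 m) / (5 cm x 5 cm).
coarse_area_cm2 = 800 * 1000
fine_area_cm2 = 5 * 5
upscaling_ratio = coarse_area_cm2 // fine_area_cm2

def perm_bounds(k_fine):
    """Arithmetic (upper, flow parallel to layers) and harmonic (lower,
    flow across layers) bounds for the upscaled permeability of a block."""
    k = np.asarray(k_fine, float)
    return k.mean(), len(k) / np.sum(1.0 / k)
```

Any flow-based upscaled permeability for the block should fall between these two bounds; falling outside signals an error in the upscaling.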


2021 ◽  
Vol 73 (04) ◽  
pp. 60-61
Author(s):  
Chris Carpenter

This article, written by JPT Technology Editor Chris Carpenter, contains highlights of paper SPE 199149, “Rate-Transient-Analysis-Assisted History Matching With a Combined Hydraulic Fracturing and Reservoir Simulator,” by Garrett Fowler, SPE, and Mark McClure, SPE, ResFrac, and Jeff Allen, Recoil Resources, prepared for the 2020 SPE Latin American and Caribbean Petroleum Engineering Conference, originally scheduled to be held in Bogota, Colombia, 17–19 March. The paper has not been peer reviewed. This paper presents a step-by-step work flow to facilitate history matching numerical simulation models of hydraulically fractured shale wells. Sensitivity-analysis simulations are performed with a coupled hydraulic fracturing, geomechanics, and reservoir simulator. The results are used to develop what the authors term “motifs” that inform the history-matching process. Using intuition from these simulations, history matching can be expedited by changing matrix permeability, fracture conductivity, matrix-pressure-dependent permeability, boundary effects, and relative permeability.
Introduction The concept of rate transient analysis (RTA) involves the use of rate and pressure trends of producing wells to estimate properties such as permeability and fracture surface area. While very useful, RTA is an analytical technique and has commensurate limitations. In the complete paper, different RTA motifs are generated using a simulator. Insights from these motif simulations are used to modify simulation parameters to expedite and inform the history-matching process. The simulation history-matching work flow presented includes the following steps:
1. Set up a simulation model with geologic properties, wellbore and completion designs, and fracturing and production schedules.
2. Run an initial model.
3. Tune the fracture geometries (height and length) to heuristic data: microseismic, frac-hit data, distributed acoustic sensing, or other diagnostics.
4. Match instantaneous shut-in pressure (ISIP) and wellhead pressure (WHP) during injection.
5. Make RTA plots of the real and simulated production data.
6. Use the motifs presented in the paper to identify possible production mechanisms in the real data.
7. Adjust history-matching parameters in the simulation model based on the intuition gained from RTA of the real data.
8. Iterate Steps 5 through 7 to obtain a match in RTA trends.
9. Modify relative permeabilities as necessary to obtain correct oil, water, and gas proportions.
In this study, the authors used a commercial simulator that fully integrates hydraulic fracturing, wellbore, and reservoir simulation into a single modeling code.
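One of the standard RTA plots behind Step 5 is the square-root-of-time plot: during linear flow, rate-normalized pressure drop is a straight line in √t. A hedged sketch of extracting that slope (function and variable names are assumptions, not from the paper):

```python
import numpy as np

def sqrt_time_slope(t, q, pwf, p_init):
    """Slope of rate-normalized pressure drop (p_i - p_wf)/q versus sqrt(t).
    A straight line on this plot is the classic linear-flow RTA signature;
    its slope relates to permeability and fracture surface area."""
    x = np.sqrt(np.asarray(t, float))
    y = (p_init - np.asarray(pwf, float)) / np.asarray(q, float)
    slope, _ = np.polyfit(x, y, 1)                  # least-squares straight line
    return float(slope)
```

Comparing this slope (and departures from linearity) between real and simulated data is how the motif comparison in Steps 5-7 is made quantitative.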
Matching Fracturing Data The complete paper focuses on matching production data, assisted by RTA, not specifically on matching fracturing data such as injection pressure and fracture geometry (Steps 3 and 4). Nevertheless, for completeness, these steps are briefly summarized in this section. Effective fracture toughness is the most important factor in determining fracture length. Field diagnostics suggest considerable variability in effective fracture toughness and fracture length. Typical half-lengths are between 500 and 2,000 ft. Laboratory-derived values of fracture toughness yield longer fractures (propagation of 2,000 ft or more from the wellbore). Significantly larger values of fracture toughness are needed to explain the shorter fracture lengths and higher net pressures that are often observed. The authors use a scale-dependent fracture-toughness parameter to increase toughness as the fracture grows. This allows the simulator to match injection-pressure data while simultaneously limiting fracture length. This scale-dependent toughness parameter is the most important parameter in determining fracture size.
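A scale-dependent toughness is often written as a power law in fracture size; the article states only that toughness increases as the fracture grows, so the functional form, reference size, and exponent below are all assumptions for illustration:

```python
def scaled_toughness(k_ic0, frac_size, ref_size=1.0, alpha=0.5):
    """Apparent fracture toughness growing with fracture size (power law).
    k_ic0 is the toughness at the reference size; alpha > 0 makes toughness
    rise as the fracture grows, limiting fracture length while still
    matching injection pressure. All values here are illustrative; the
    simulator's actual scaling law is not given in the article."""
    return k_ic0 * (frac_size / ref_size) ** alpha
```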

