A Methodology To Quantify the Impact of Uncertainties in the History-Matching Process and in the Production Forecast

Author(s):  
Celio Maschio ◽  
Denis Jose Schiozer ◽  
Marcos Antonio Bezerra de Moura Filho
2021 ◽  
Author(s):  
Oleksandr Doroshenko ◽  
Miljenko Cimic ◽  
Nicholas Singh ◽  
Yevhen Machuzhak

Abstract A fully integrated production model (IPM) has been implemented in the Sakhalin field to optimize hydrocarbon production and carry out effective field development. To achieve this goal, a strategy was carefully executed to align the surface-facilities upgrade with the production forecast. The main challenges we faced were: all facilities were designed in the late 1980s for the early production stage, and as the asset aged, the pipeline sizes, routing, and compression strategies needed review; detecting, predicting, and reducing liquid loading is required so that the operator can proactively control the hydrocarbon production process; and no integrated asset model existed to date. The most significant engineering tasks were solved by creating models of the reservoirs, wells, and surface network facilities; after history matching and connecting all elements of the model into a single environment, it was used for different production forecast scenarios, taking into account the impact of infrastructure bottlenecks on the production of each well. This paper describes in detail the methodology applied to calculate optimal well control, wellhead pressure, and pressure at the inlet of the booster compressor, as well as to improve surface-flowline capacity. Using the model, we determined the compressor capacity required for more than the next ten years and assessed the impact of pipeline upgrades on oil, gas, and condensate production. Using optimization algorithms, a realistic scenario was defined and used as a basis for maximizing hydrocarbon production. The integrated production model (IPM) and production optimization provided several development scenarios to achieve target production at the lowest cost by eliminating infrastructure constraints.
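The trade-off the integrated model resolves, well deliverability against flowline and compressor constraints, can be illustrated with a deliberately simplified sketch; all pressures, rates, capacities, and the linear deliverability relation below are made-up illustrative assumptions, not Sakhalin field data:

```python
def well_rate(p_inlet, p_res=250.0, pi=0.8):
    """Toy backpressure deliverability: a well's gas rate falls linearly
    as the booster-compressor inlet (back)pressure rises."""
    return max(0.0, pi * (p_res - p_inlet))

def field_rate(p_inlet, n_wells=10, flowline_cap=1200.0):
    """Field rate: combined well deliverability capped by flowline capacity."""
    return min(n_wells * well_rate(p_inlet), flowline_cap)

# Scan candidate compressor inlet pressures for the maximum field rate.
best_p = max(range(10, 200, 5), key=field_rate)
constrained = field_rate(best_p)                      # flowline-limited
debottlenecked = min(10 * well_rate(best_p), 2000.0)  # after a pipeline upgrade
```

Under these toy numbers the flowline cap, not well deliverability, limits the field; raising the cap (a pipeline upgrade) releases the constrained volume, which is the kind of debottlenecking effect the abstract describes.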


2021 ◽  
pp. 074171362110190
Author(s):  
Fabian Rüter ◽  
Andreas Martin

Participation in adult learning and education requires the availability of, and accessibility to, learning opportunities provided by educational institutions. One fundamental element is time. Adult learning and education participation can only be realized by successfully matching individual time-availabilities with the temporal organization of provided courses. To address this required matching process, this study contributes to research literature as one of the first studies that investigates the impact of timing and course duration on participation counts (longitudinally). For this, we use organizational data from public adult education centers ( Volkshochschulen—VHS; the main adult education providers in Germany) from 2007 to 2017. Methodologically, random- and fixed-effects models are applied. We find significant positive effects on participation counts between increasing program breadth in terms of temporal formats and increasing average course duration.


SPE Journal ◽  
2017 ◽  
Vol 22 (05) ◽  
pp. 1506-1518 ◽  
Author(s):  
Pedram Mahzari ◽  
Mehran Sohrabi

Summary Three-phase flow in porous media during water-alternating-gas (WAG) injections and the associated cycle-dependent hysteresis have been the subject of experimental and theoretical studies. In spite of attempts to develop models and simulation methods for WAG injections and three-phase flow, the current lack of a solid approach to handling hysteresis effects in simulating WAG-injection scenarios has resulted in misinterpretations of simulation outcomes at laboratory and field scales. In this work, by use of our improved methodology, the first cycle of the WAG experiments (first waterflood and the subsequent gasflood) was history matched to estimate the two-phase relative permeabilities (krs) (oil/water and gas/oil). For subsequent cycles, pertinent parameters of the WAG hysteresis model are included in the automatic-history-matching process to reproduce all WAG cycles together. The results indicate that history matching the whole WAG experiment leads to a significantly improved simulation outcome, which highlights the importance of two elements in evaluating WAG experiments: inclusion of the full WAG experiment in history matching and use of a more representative set of two-phase krs, which originated from our new methodology of estimating two-phase krs from the first cycle of a WAG experiment. Because WAG-related parameters should be able to model any three-phase flow irrespective of WAG scenarios, in another exercise, the tuned parameters obtained from a WAG experiment (starting with water) were used in a similar coreflood test (WAG starting with gas) to assess predictive capability for simulating three-phase flow in porous media. After identifying shortcomings of existing models, an improved methodology was used to history match multiple coreflood experiments simultaneously to estimate parameters that can reasonably capture the processes taking place in WAG under different scenarios, that is, starting with water or gas.
The comprehensive simulation study performed here would shed some light on a consolidated methodology to estimate saturation functions that can simulate WAG injections at different scenarios.
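For context, the two-phase krs being estimated are commonly represented by parametric saturation functions; the sketch below is a generic Corey-type parameterization of the sort tuned in such history matching, with endpoint and exponent values that are illustrative assumptions, not the authors' matched parameters:

```python
def corey_krw(sw, swc=0.2, sor=0.25, krw_max=0.4, nw=2.5):
    """Water relative permeability, Corey form (illustrative parameters)."""
    s = (sw - swc) / (1.0 - swc - sor)      # normalized water saturation
    s = min(max(s, 0.0), 1.0)               # clamp to the mobile range
    return krw_max * s ** nw

def corey_kro(sw, swc=0.2, sor=0.25, kro_max=0.9, no=2.0):
    """Oil relative permeability, Corey form (illustrative parameters)."""
    s = (1.0 - sw - sor) / (1.0 - swc - sor)
    s = min(max(s, 0.0), 1.0)
    return kro_max * s ** no
```

In a history-matching loop, swc, sor, the endpoints, and the exponents are the degrees of freedom adjusted until simulated production reproduces the first-cycle data.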


2021 ◽  
Vol 73 (04) ◽  
pp. 60-61
Author(s):  
Chris Carpenter

This article, written by JPT Technology Editor Chris Carpenter, contains highlights of paper SPE 199149, “Rate-Transient-Analysis-Assisted History Matching With a Combined Hydraulic Fracturing and Reservoir Simulator,” by Garrett Fowler, SPE, and Mark McClure, SPE, ResFrac, and Jeff Allen, Recoil Resources, prepared for the 2020 SPE Latin American and Caribbean Petroleum Engineering Conference, originally scheduled to be held in Bogota, Colombia, 17–19 March. The paper has not been peer reviewed. This paper presents a step-by-step work flow to facilitate history matching numerical simulation models of hydraulically fractured shale wells. Sensitivity analysis simulations are performed with a coupled hydraulic fracturing, geomechanics, and reservoir simulator. The results are used to develop what the authors term “motifs” that inform the history-matching process. Using intuition from these simulations, history matching can be expedited by changing matrix permeability, fracture conductivity, matrix-pressure-dependent permeability, boundary effects, and relative permeability. Introduction The concept of rate transient analysis (RTA) involves the use of rate and pressure trends of producing wells to estimate properties such as permeability and fracture surface area. While very useful, RTA is an analytical technique and has commensurate limitations. In the complete paper, different RTA motifs are generated using a simulator. Insights from these motif simulations are used to modify simulation parameters to expedite and inform the history-matching process. The simulation history-matching work flow presented includes the following steps:
1. Set up a simulation model with geologic properties, wellbore and completion designs, and fracturing and production schedules.
2. Run an initial model.
3. Tune the fracture geometries (height and length) to heuristic data: microseismic, frac-hit data, distributed acoustic sensing, or other diagnostics.
4. Match instantaneous shut-in pressure (ISIP) and wellhead pressure (WHP) during injection.
5. Make RTA plots of the real and simulated production data.
6. Use the motifs presented in the paper to identify possible production mechanisms in the real data.
7. Adjust history-matching parameters in the simulation model based on the intuition gained from RTA of the real data.
8. Iterate Steps 5 through 7 to obtain a match in RTA trends.
9. Modify relative permeabilities as necessary to obtain correct oil, water, and gas proportions.
In this study, the authors used a commercial simulator that fully integrates hydraulic fracturing, wellbore, and reservoir simulation into a single modeling code.
Matching Fracturing Data The complete paper focuses on matching production data, assisted by RTA, not specifically on the matching of fracturing data such as injection pressure and fracture geometry (Steps 3 and 4). Nevertheless, for completeness, these steps are very briefly summarized in this section. Effective fracture toughness is the most-important factor in determining fracture length. Field diagnostics suggest considerable variability in effective fracture toughness and fracture length. Typical half-lengths are between 500 and 2,000 ft. Laboratory-derived values of fracture toughness yield longer fractures (propagation of 2,000 ft or more from the wellbore). Significantly larger values of fracture toughness are needed to explain the shorter fracture length and higher net pressure values that are often observed. The authors use a scale-dependent fracture-toughness parameter to increase toughness as the fracture grows. This allows the simulator to match injection pressure data while simultaneously limiting fracture length. This scale-dependent toughness scaling parameter is the most-important parameter in determining fracture size.
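The iterative core of the nine-step workflow (Steps 5 through 8: compare RTA trends of real and simulated data, adjust parameters, repeat) can be sketched generically; the function names, the scalar trend summary, and the multiplicative update rule below are all illustrative assumptions, not the paper's implementation:

```python
def history_match(simulate, rta_trend, adjust, params, observed,
                  max_iter=50, tol=0.05):
    """Iterate: simulate, compare RTA trends, adjust parameters until
    the relative mismatch in the trend summary falls below tol."""
    for _ in range(max_iter):
        simulated = simulate(params)
        real_t, sim_t = rta_trend(observed), rta_trend(simulated)
        if abs(real_t - sim_t) / abs(real_t) < tol:
            break
        params = adjust(params, real_t, sim_t)
    return params

# Toy stand-ins: a sqrt-of-time decline whose level scales with "perm".
observed = [100.0 / t ** 0.5 for t in range(1, 11)]
simulate = lambda p: [p["perm"] * r for r in observed]
rta_trend = sum                                   # crude scalar trend summary
adjust = lambda p, real_t, sim_t: {"perm": p["perm"] * real_t / sim_t}
matched = history_match(simulate, rta_trend, adjust, {"perm": 0.3}, observed)
```

In practice each "adjust" step changes physical parameters (matrix permeability, fracture conductivity, and so on) guided by the RTA motifs, rather than a single multiplier.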


2021 ◽  
Author(s):  
Boxiao Li ◽  
Hemant Phale ◽  
Yanfen Zhang ◽  
Timothy Tokar ◽  
Xian-Huan Wen

Abstract Design of Experiments (DoE) is one of the most commonly employed techniques in the petroleum industry for Assisted History Matching (AHM) and uncertainty analysis of reservoir production forecasts. Although conceptually straightforward, DoE is often misused by practitioners because many of its statistical and modeling principles are not carefully followed. Our earlier paper (Li et al. 2019) detailed the best practices in DoE-based AHM for brownfields. However, to the best of our knowledge, there is a lack of studies that summarize the common caveats and pitfalls in DoE-based production-forecast uncertainty analysis for greenfields and history-matched brownfields. Our objective here is to summarize these caveats and pitfalls to help practitioners apply the correct principles for DoE-based production-forecast uncertainty analysis. More than 60 common pitfalls in all stages of a DoE workflow are summarized. Special attention is paid to the following critical project transitions: (1) from static earth modeling to dynamic reservoir simulation; (2) from AHM to production forecast; and (3) from analyzing subsurface uncertainties to analyzing field-development alternatives. Most pitfalls can be avoided by consistently following the statistical and modeling principles. Some pitfalls, however, can trap even experienced engineers. For example, mistakes made in handling the three abovementioned transitions can yield strongly unreliable proxies and sensitivity analyses. In the representative examples we study, such mistakes can lead to a proxy R2 of less than 0.2, versus greater than 0.9 when done correctly. Two improved experimental designs are created to resolve this challenge. Besides the technical pitfalls that are avoidable via robust statistical workflows, we also highlight the often more severe non-technical pitfalls that cannot be evaluated by measures like R2.
Thoughts are shared on how they can be avoided, especially during project framing and the three critical transition scenarios.
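One guard against the proxy-reliability pitfall the authors quantify with R2 is to always score the proxy on blind verification runs rather than on its own training design; a minimal sketch (the quadratic toy response, design sizes, and feature set are assumptions for illustration, not the paper's workflow):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulator(X):
    """Stand-in for a reservoir-simulator response with an interaction term."""
    return 3.0 + 2.0 * X[:, 0] - 1.5 * X[:, 1] + X[:, 0] * X[:, 1]

def features(X):
    """Proxy basis: intercept, main effects, and the two-factor interaction."""
    return np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                            X[:, 0] * X[:, 1]])

X_train = rng.uniform(-1.0, 1.0, size=(40, 2))     # design (training) runs
coef, *_ = np.linalg.lstsq(features(X_train), simulator(X_train), rcond=None)

X_ver = rng.uniform(-1.0, 1.0, size=(20, 2))       # blind verification runs
y_ver = simulator(X_ver)
y_hat = features(X_ver) @ coef
r2 = 1.0 - np.sum((y_ver - y_hat) ** 2) / np.sum((y_ver - y_ver.mean()) ** 2)
```

A verification R2 near 1 indicates a trustworthy proxy here; dropping the interaction column from the basis, or reusing training runs for scoring, are exactly the kinds of shortcuts that produce misleading R2 values.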


2021 ◽  
Author(s):  
Ali Al-Turki ◽  
Obai Alnajjar ◽  
Majdi Baddourah ◽  
Babatunde Moriwawon

Abstract The algorithms and workflows have been developed to couple efficient model parameterization with stochastic, global optimization using a Multi-Objective Genetic Algorithm (MOGA) for global history matching, coupled with an advanced workflow for streamline sensitivity-based inversion for fine-tuning. During parameterization, the low-rank subsets of the most influential reservoir parameters are identified and propagated to MOGA to perform the field-level history match. Data misfits between the field historical data and simulation data are calculated with multiple realizations of reservoir models that quantify and capture reservoir uncertainty. Each generation of the optimization algorithm reduces the data misfit relative to the previous iteration. This iterative process continues until a satisfactory field-level history match is reached or there are no further improvements. The fine-tuning process of well-connectivity calibration is then performed with a streamline sensitivity-based inversion algorithm to locally update the model and reduce well-level mismatch. In this study, an application of the proposed algorithms and workflow is demonstrated for model calibration and history matching. The synthetic reservoir model used in this study is discretized into millions of grid cells with hundreds of producer and injector wells. It is designed to generate several decades of production and injection history to evaluate and demonstrate the workflow. In field-level history matching, reservoir rock properties (e.g., permeability, fault transmissibility, etc.) are parameterized to conduct the global match of pressure and production rates. The Grid Connectivity Transform (GCT) was used and assessed to parameterize the reservoir properties. In addition, the convergence rate and history-match quality of MOGA were assessed during the field (global) history matching.
Also, the effectiveness of the streamline-based inversion was evaluated by quantifying the additional improvement in history matching quality per well. The developed parametrization and optimization algorithms and workflows revealed the unique features of each of the algorithms for model calibration and history matching. This integrated workflow has successfully defined and carried uncertainty throughout the history matching process. Following the successful field-level history match, the well-level history matching was conducted using streamline sensitivity-based inversion, which further improved the history match quality and conditioned the model to historical production and injection data. In general, the workflow results in enhanced history match quality in a shorter turnaround time. The geological realism of the model is retained for robust prediction and development planning.
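The generational misfit-reduction loop can be caricatured in a few lines; this is a single-objective toy (the actual workflow uses a multi-objective GA over GCT-parameterized reservoir properties), with the misfit, population sizes, and operators all chosen for illustration:

```python
import random

random.seed(1)

def misfit(model, history):
    """Squared data misfit between a candidate model's response and history
    (here the 'response' is the parameter vector itself, for brevity)."""
    return sum((m - h) ** 2 for m, h in zip(model, history))

history = [0.3, 0.7, 0.5]                        # stand-in observed data
pop = [[random.random() for _ in range(3)] for _ in range(30)]

for generation in range(40):
    pop.sort(key=lambda ind: misfit(ind, history))
    parents = pop[:10]                           # selection: keep best models
    children = []
    while len(children) < 20:
        a, b = random.sample(parents, 2)         # crossover: blend two parents
        children.append([(x + y) / 2 + random.gauss(0, 0.05)
                         for x, y in zip(a, b)]) # plus Gaussian mutation
    pop = parents + children                     # elitism: parents survive

best = min(pop, key=lambda ind: misfit(ind, history))
```

Because the best individuals always survive, the misfit of the best model is non-increasing across generations, matching the convergence behavior described in the abstract.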


2021 ◽  
Author(s):  
Son Hoang ◽  
Tung Tran ◽  
Tan Nguyen ◽  
Tu Truong ◽  
Duy Pham ◽  
...  

Abstract This paper reports a successful case study of applying machine learning to improve the history matching process, making it easier, less time-consuming, and more accurate, by determining whether Local Grid Refinement (LGR) with a transmissibility multiplier is needed to history match gas-condensate wells producing from geologically complex reservoirs, as well as determining the required LGR setup for those gas-condensate producers. History matching Hai Thach gas-condensate production wells is extremely challenging due to the combined effect of condensate banking, a sub-seismic fault network, complex reservoir distribution and connectivity, uncertain HIIP, and a lack of PVT data for most reservoirs. In fact, for some wells, many trial simulation runs were conducted before it became clear that LGR with a transmissibility multiplier was required to obtain a good history match. To minimize this time-consuming trial-and-error process, machine learning was applied in this study to analyze production data using synthetic samples generated by a very large number of compositional sector models, so that the need for LGR could be identified before the history matching process begins. Furthermore, the machine learning application could also determine the required LGR setup. The method helped provide better models in a much shorter time and greatly improved the efficiency and reliability of the dynamic modeling process. More than 500 synthetic samples were generated using compositional sector models and divided into separate training and test sets. Multiple classification algorithms such as logistic regression, Gaussian Naive Bayes, Bernoulli Naive Bayes, multinomial Naive Bayes, linear discriminant analysis, support vector machine, K-nearest neighbors, and Decision Tree, as well as artificial neural networks, were applied to predict whether LGR was used in the sector models.
The best algorithm was found to be the Decision Tree classifier, with 100% accuracy on the training set and 99% accuracy on the test set. The LGR setup (size of LGR area and range of transmissibility multiplier) was also predicted best by the Decision Tree classifier with 91% accuracy on the training set and 88% accuracy on the test set. The machine learning model was validated using actual production data and the dynamic models of history-matched wells. Finally, using the machine learning prediction on wells with poor history matching results, their dynamic models were updated and significantly improved.
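The classification idea, predicting from production-derived features whether a well will need LGR before any trial runs, can be reduced to a toy from-scratch decision stump (a single split on a single feature); the feature, labels, and threshold rule below are illustrative assumptions, not the Hai Thach models:

```python
def train_stump(samples):
    """samples: (feature, needs_lgr) pairs. Pick the threshold t that
    maximizes the accuracy of the rule 'needs LGR iff feature >= t'."""
    candidates = sorted({f for f, _ in samples})
    return max(candidates,
               key=lambda t: sum((f >= t) == label for f, label in samples))

# Synthetic "condensate-banking severity" feature; high values need LGR.
samples = [(0.10, False), (0.20, False), (0.35, False),
           (0.60, True), (0.75, True), (0.90, True)]
threshold = train_stump(samples)
predict_lgr = lambda severity: severity >= threshold
```

A real decision tree recurses this split selection over many features; with enough labeled sector-model samples, that is what separates wells needing LGR from those that do not.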


2019 ◽  
Vol 18 (3) ◽  
pp. 456-482
Author(s):  
Laurie Krigman ◽  
Mia L. Rivolta

Purpose This paper aims to investigate the roles of non-CEO inside directors (NCIDs) in the new CEO-firm matching process using the context of unplanned CEO departures when immediate CEO succession planning becomes a sole board responsibility. Although critics argue that inside directors decrease the monitoring effectiveness of a board, inside directors arguably possess superior firm-specific experience and knowledge that can be beneficial during the leadership transition. Design/methodology/approach The authors use a comprehensive, manually collected data set of unplanned CEO departures from 1993 to 2012. Findings The authors find that NCIDs play an important role in the CEO transitioning process. They help firms identify qualified inside replacements and provide stability as the new permanent or interim CEO. In addition, NCIDs facilitate the transfer of information and help the new external CEOs succeed. They show that the longer the NCID stays with the company, the longer the tenure of the new CEO. They also document that the presence of NCIDs improves operating and stock performance; especially when the new CEO is hired from outside of the firm. Practical implications The impact of NCIDs is particularly important when the firm hires an outsider as the new CEO. These results suggest that board composition affects frictions in the CEO labor market. Originality/value The literature has predominantly focused on the downside of having inside directors. Too many inside directors on a firm’s board is often associated with ineffective boards and entrenchment. To the contrary, the authors focus on a potential benefit of having inside directors.


2010 ◽  
Author(s):  
Flavio Dickstein ◽  
Paulo Goldfeld ◽  
Gustavo Pfeiffer ◽  
Elisa Amorim ◽  
Rodrigo dos Santos ◽  
...  

SPE Journal ◽  
2013 ◽  
Vol 19 (04) ◽  
pp. 621-635 ◽  
Author(s):  
Cheng Dai ◽  
Heng Li ◽  
Dongxiao Zhang

Summary Reservoir simulations involve a large number of formation and fluid parameters, many of which are subject to uncertainties owing to the combination of spatial heterogeneity and insufficient measurements. Accurately quantifying the impact of varying parameters on simulation models can reveal the importance of the parameters, which helps in designing field-characterization strategies and determining parameterization for history matching. Compared with the commonly used local sensitivity analysis (SA), global SA considers the whole variation range of the parameters and can thus provide more-complete information. However, the traditional global sensitivity analysis that is derived from Monte Carlo simulation (MCS) is computationally too demanding for reservoir simulations. In this study, we propose an alternative approach that is both accurate and efficient. In the proposed approach, the model outputs such as pressure and reservoir production quantities are expressed by polynomial chaos expansions (PCEs). The probabilistic collocation method is used to determine the coefficients of the polynomial expansions by solving outputs at different sets of collocation points by means of the original partial-differential equations. Then, a proxy is constructed with such coefficients. Accurate statistical sensitivity indices of the uncertainty parameters can be obtained by running the proxy. We validate the approach with 2D examples by comparing with the MCS-based global SA. It is found that with only a small fraction of the computational cost required by the MCS approach, the new approach gives accurate global sensitivity for each parameter. The proposed approach is also demonstrated on a large-scale 3D black-oil model, for which the MCS-based global SA is found to be computationally infeasible. 
It is found that the developed approach possesses the following key advantages: It requires a much smaller number of reservoir simulations for accurate global SA; it is nonintrusive and can be implemented with existing codes or simulators; and it can accommodate arbitrary distributions of parameters encountered in realistic geological situations.
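The mechanics of reading sensitivity indices off a PCE can be shown in a tiny example: for a linear toy response of two independent uniform parameters on [-1, 1], fit degree-1 Legendre chaos coefficients by regression at collocation points, then form main-effect (Sobol) indices from squared coefficients. Everything below (the response, the collocation points) is an illustrative assumption, not the paper's 2D/3D cases:

```python
import numpy as np

def response(x1, x2):
    """Stand-in for a simulator output (e.g., a production quantity)."""
    return 10.0 + 4.0 * x1 + 2.0 * x2

# Orthonormal Legendre basis on [-1, 1]: psi0 = 1, psi1(x) = sqrt(3) * x.
pts = np.array([(-1, -1), (-1, 1), (1, -1), (1, 1), (0, 0)], dtype=float)
Psi = np.column_stack([np.ones(len(pts)),
                       np.sqrt(3.0) * pts[:, 0],
                       np.sqrt(3.0) * pts[:, 1]])
y = np.array([response(a, b) for a, b in pts])
coef, *_ = np.linalg.lstsq(Psi, y, rcond=None)   # collocation by regression

var_total = coef[1] ** 2 + coef[2] ** 2          # variance from PCE coefficients
S1 = coef[1] ** 2 / var_total                    # main-effect index of x1
S2 = coef[2] ** 2 / var_total                    # main-effect index of x2
```

Because the basis is orthonormal, each squared coefficient is that term's variance contribution, so the sensitivity indices come for free once the proxy is built; no Monte Carlo sampling of the simulator is needed.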

