A Novel Integrated Approach to 3D Modeling and History Matching of Gas Condensate Fields with Paucity of Geological and Production Data

2021
Author(s):  
Denys Grytsai ◽  
Petro Shtefura ◽  
Vadym Dodukh

Abstract A methodology has been developed that, under conditions of limited geological and production data, integrates petrophysical, geological, and hydrodynamic models as components of a permanent 3D model, establishing physical relationships between the parameters that describe the entire system. In the proposed method, modelling is based on the interpretation of continuous shale-volume and porosity curves. From the analysis of core data, multi-vector physical correlations with other parameters are built. Cut-off values of shale volume are defined to distinguish reservoirs from non-reservoirs, and cut-off values of porosity are set to exclude tight reservoirs with no filtration. The Winland R35 method is used to compute the pore-throat radius, allowing the reservoirs to be divided into classes. For each reservoir class, a permeability-versus-porosity dependence is determined, and the Wright-Woody-Johnson method yields equations for bound-water content. A system of configured workflows has been developed that automates re-modelling and simplifies history matching. This technique was successfully applied to several 3D models of gas condensate fields that, despite extensive drilling across their areas and a long development history, are characterized by limited geological and production data. The workflow system, together with the proposed approach, simplified the history-matching process by splitting it into several stages. At each stage, depending on the type of input data, different parameters were matched (production, reservoir and wellhead pressures, etc.). Due to the cross-functional correlation of all components, the model's parameter uncertainty was significantly reduced, allowing a detailed history match of the development history for the entire well stock.
The results obtained were verified by several geological and technological operations, including the drilling of new wells, and showed close agreement with the forecast indicators. The proposed approach to modelling and history matching under conditions of limited geological and production data allows: – ensuring integration and correlation of petrophysical, geological, and hydrodynamic models as components of a permanent 3D model; – automating and simplifying modelling, history matching, and model updating; – improving the quality of parameter-matching results.
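The rock-typing step described in the abstract can be illustrated with a short sketch. The Winland correlation itself is standard (R35 in micrometres, permeability in mD, porosity in percent), but the class boundaries below are common literature cut-offs, not the paper's own values; treat them as assumptions:

```python
import math

def winland_r35(perm_md, phi_pct):
    """Winland correlation: log10(R35) = 0.732 + 0.588*log10(k) - 0.864*log10(phi).

    perm_md: air permeability in millidarcies; phi_pct: porosity in percent.
    Returns the pore-throat radius at 35% mercury saturation, in micrometres.
    """
    return 10 ** (0.732 + 0.588 * math.log10(perm_md) - 0.864 * math.log10(phi_pct))

def rock_class(r35_um):
    """Port-size classes; boundaries (in micrometres) are common literature cut-offs."""
    if r35_um >= 10.0:
        return "megaport"
    if r35_um >= 2.5:
        return "macroport"
    if r35_um >= 0.5:
        return "mesoport"
    if r35_um >= 0.1:
        return "microport"
    return "nanoport"
```

Each core plug's (k, φ) pair maps to a class, and a separate permeability-versus-porosity trend is then fitted per class, as in the abstract.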

2013
Vol. 16 (04)
pp. 412-422
Author(s):  
A.M. Farid ◽  
Ahmed H. El-Banbi ◽  
A.A. Abdelwaly

Summary The depletion performance of gas/condensate reservoirs is highly influenced by changes in fluid composition below the dewpoint. The long-term prediction of condensate/gas reservoir behavior is therefore difficult because of the complexity of both composition variation and two-phase-flow effects. In this paper, an integrated model was developed to simulate gas-condensate reservoir/well behavior. The model couples the compositional material balance or the generalized material-balance equations for reservoir behavior, the two-phase pseudopressure integral for near-wellbore behavior, and outflow correlations for wellbore behavior. An optimization algorithm was also used with the integrated model so it can be used in history-matching mode to estimate original gas in place (OGIP), original oil in place (OOIP), and productivity-index (PI) parameters for gas/condensate wells. The model also can be used to predict the production performance for variable tubinghead pressure (THP) and variable production rate. The model runs fast and requires minimal input. The developed model was validated by use of different simulation cases generated with a commercial compositional reservoir simulator for a variety of reservoir and well conditions. The results show a good agreement between the simulation cases and the integrated model. After validating the integrated model against the simulated cases, the model was used to analyze production data for a rich-gas/condensate field (initial condensate/gas ratio of 180 bbl/MMscf). THP data for four wells were used along with basic reservoir and production data to obtain original fluids in place and PIs of the wells. The estimated parameters were then used to forecast the gas and condensate production above and below the dewpoint. The model is also capable of predicting reservoir pressure, bottomhole flowing pressure, and THP and can account for completion changes when they occur.
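The history-matching mode described above estimates OGIP by fitting a material-balance model to production data. As a much simplified illustration (a dry-gas p/z straight line rather than the paper's compositional/generalized material balance), OGIP can be read off as the Gp-axis intercept of a least-squares fit of p/z versus cumulative production:

```python
def estimate_ogip(gp, p_over_z):
    """Fit the dry-gas material balance p/z = (p/z)_i * (1 - Gp/G) by least squares.

    gp: cumulative gas produced (e.g. Bscf); p_over_z: matching p/z values (psia).
    Returns G (OGIP) as the Gp-axis intercept of the fitted line, i.e. Gp at p/z = 0.
    """
    n = len(gp)
    mean_x = sum(gp) / n
    mean_y = sum(p_over_z) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(gp, p_over_z))
             / sum((x - mean_x) ** 2 for x in gp))
    intercept = mean_y - slope * mean_x   # (p/z)_i at Gp = 0
    return -intercept / slope             # OGIP
```

Below the dewpoint the real problem is not linear in p/z, which is precisely why the paper couples a compositional material balance with near-wellbore and wellbore models instead.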


2021
Author(s):  
Elizabeth Ruiz ◽  
Brandon Thibodeaux ◽  
Christopher Dorion ◽  
Herman Mukisa ◽  
Majid Faskhoodi ◽  
...  

Abstract Optimized geomodeling and history matching of production data is presented by utilizing an integrated rock and fluid workflow. Facies identification is performed by use of image logs and other geological information. In addition, image logs are used to help define structural geodynamic processes that occurred in the reservoir. Methods of reservoir fluid geodynamics are used to assess the extent of fluid compositional equilibrium, especially of the asphaltenes, and thereby the extent of connectivity in these facies. Geochemical determinations are shown to be consistent with measurements of compositional thermodynamic equilibrium. The ability to develop the geo-scenario of the reservoir, the coherent evolution of rock and contained fluids in the reservoir over geologic time, improves the robustness of the geomodel. In particular, the sequence of oil charge, compositional equilibrium, fault block throw, and primary biogenic gas charge is established in this middle Pliocene reservoir, with implications for production, field extension, and local basin exploration. History matching of production data proves the accuracy of the geomodel; nevertheless, refinements to the geomodel and improved history matching were obtained by expanded deterministic property estimation from wireline log and other data. The early connection of fluid data, both thermodynamic and geochemical, with the relevant facies and their property determination enables a more facile method to incorporate these data into the geomodel. Logging data from future wells in the field can be imported into the geomodel, allowing deterministic optimization of this model long after production has commenced. While each reservoir is unique with its own idiosyncrasies, the workflow presented here is generally applicable to all reservoirs and always improves reservoir understanding.


Author(s):  
M. Abdelaziz ◽  
M. Elsayed

Abstract. Underwater photogrammetry in archaeology in Egypt is a completely new experience, applied for the first time on the submerged archaeological site of the lighthouse of Alexandria, situated on the eastern extremity of the ancient island of Pharos at the foot of Qaitbay Fort at a depth of 2 to 9 metres. In 2009/2010, the CEAlex launched a 3D photogrammetry data-gathering programme for the virtual reassembly of broken artefacts. In 2013 and early 2014, with the support of the Honor Frost Foundation, methods were developed and refined to acquire manual photographic data of the entire underwater site of Qaitbay using a DSLR camera and simple, low-cost materials, in order to obtain a digital surface model (DSM) of the submerged site of the lighthouse and to create 3D models of the objects themselves, such as statues, bases of statues, and architectural elements. In this paper we present the methodology used for underwater data acquisition, data processing, and modelling in order to generate a DSM of the submerged site of Alexandria's ancient lighthouse. Up to 2016, only about 7200 m² of the submerged site, which exceeds 13,000 m², had been covered. One of the main objectives of this project is to georeference the site, since this would allow for a very precise 3D model and for correcting the orientation of the site with respect to real-world space.


Author(s):  
D. Einaudi ◽  
A. Spreafico ◽  
F. Chiabrando ◽  
C. Della Coletta

Abstract. Rebuilding the past of cultural heritage through digitization, archiving, and visualization by means of digital technology is an emerging concern, both to ensure the transmission of physical and digital documentation to future generations as evidence of culture and to enable the present generation to enlarge, facilitate, and cross-relate data and information in new ways. In this global effort, the digital 3D documentation of no-longer-existing cultural heritage can be essential for understanding past events, and various digital techniques and tools are being developed for multiple purposes. The present research considers the entire workflow, from the collection and digitization of archive documentation to the metrically controlled creation of 3D models and their online sharing. The technical issues involved in obtaining a detailed 3D model are examined, highlighting the limits and potential of 3D reconstruction of disappeared heritage and its visualization, using three complexes belonging to the 1911 Turin World's Fair as case studies.


Author(s):  
Ryuji Nakada ◽  
Masanori Takigawa ◽  
Tomowo Ohga ◽  
Noritsuna Fujii

A digital oblique aerial camera (hereinafter "oblique camera") is an assembly of medium-format digital cameras capable of shooting digital aerial photographs in five directions simultaneously, i.e. a nadir view and four oblique views (forward, backward, left, and right); it is used to shoot digital aerial photographs efficiently for generating 3D models over a wide area.

For the aerial photogrammetry of public surveys in Japan, large-format cameras, such as the DMC and UltraCam series, are required to ensure photogrammetric accuracy.

Although oblique cameras are intended for generating 3D models, the digital aerial photographs taken in five directions need not be limited to 3D model production; they may also be suitable for digital mapping and photomaps at the accuracy required for public surveys in Japan.

In order to verify the suitability of oblique cameras for aerial photogrammetry (simultaneous adjustment, digital mapping, and photomaps), (1) a viewer was developed to interpret digital aerial photographs taken with oblique cameras, (2) digital aerial photographs were shot with an oblique camera owned by us, an IGI Penta DigiCAM, and (3) the accuracy of 3D measurements was verified.


SPE Journal
2018
Vol. 23 (05)
pp. 1496-1517
Author(s):  
Chaohui Chen ◽  
Guohua Gao ◽  
Ruijian Li ◽  
Richard Cao ◽  
Tianhong Chen ◽  
...  

Summary Although it is possible to apply traditional optimization algorithms together with the randomized-maximum-likelihood (RML) method to generate multiple conditional realizations, the computation cost is high. This paper presents a novel method to enhance the global-search capability of the distributed-Gauss-Newton (DGN) optimization method and integrates it with the RML method to generate multiple realizations conditioned to production data synchronously. RML generates samples from an approximate posterior by minimizing a large ensemble of perturbed objective functions in which the observed data and prior mean values of uncertain model parameters have been perturbed with Gaussian noise. Rather than performing these minimizations in isolation using large sets of simulations to evaluate the finite-difference approximations of the gradients used to optimize each perturbed realization, we use a concurrent implementation in which simulation results are shared among different minimization tasks whenever these results are helping to converge to the global minimum of a specific minimization task. To improve sharing of results, we relax the accuracy of the finite-difference approximations for the gradients with more widely spaced simulation results. To avoid trapping in local optima, a novel method to enhance the global-search capability of the DGN algorithm is developed and integrated seamlessly with the RML formulation. In this way, we can improve the quality of RML conditional realizations that sample the approximate posterior. The proposed work flow is first validated with a toy problem and then applied to a real-field unconventional asset. Numerical results indicate that the new method is very efficient compared with traditional methods. Hundreds of data-conditioned realizations can be generated in parallel within 20 to 40 iterations. The computational cost (central-processing-unit usage) is reduced significantly compared with the traditional RML approach. 
The real-field case studies involve a history-matching study to generate history-matched realizations with the proposed method and an uncertainty quantification of production forecasting using those conditioned models. All conditioned models generate production forecasts that are consistent with real-production data in both the history-matching period and the blind-test period. Therefore, the new approach can enhance the confidence level of the estimated-ultimate-recovery (EUR) assessment using production-forecasting results generated from all conditional realizations, resulting in significant business impact.
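The core RML idea above, minimizing an ensemble of perturbed objective functions, can be shown on a toy linear-Gaussian problem where each minimization has a closed form (in this special case RML samples the posterior exactly). The forward model and noise levels below are illustrative assumptions, not the paper's field setup:

```python
import random

def rml_samples(d_obs, m_prior, sd_d, sd_m, g=2.0, n=400, seed=7):
    """RML for a scalar linear forward model d = g*m.

    Each realization perturbs the observed datum and the prior mean with
    Gaussian noise, then minimizes the perturbed objective
        J(m) = (m - m_pert)^2 / sd_m^2 + (g*m - d_pert)^2 / sd_d^2,
    which is quadratic, so the minimizer is available in closed form.
    """
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        d_pert = d_obs + rng.gauss(0.0, sd_d)     # perturb observed data
        m_pert = m_prior + rng.gauss(0.0, sd_m)   # perturb prior mean
        m_star = ((m_pert / sd_m**2 + g * d_pert / sd_d**2)
                  / (1.0 / sd_m**2 + g**2 / sd_d**2))
        samples.append(m_star)
    return samples
```

In the paper's nonlinear setting each perturbed minimization is instead carried out with the distributed Gauss-Newton scheme, with simulation results shared across the concurrent tasks as described above.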


2021
Author(s):  
Son Hoang ◽  
Tung Tran ◽  
Tan Nguyen ◽  
Tu Truong ◽  
Duy Pham ◽  
...  

Abstract This paper reports a successful case study of applying machine learning to improve the history matching process, making it easier, less time-consuming, and more accurate, by determining whether Local Grid Refinement (LGR) with transmissibility multiplier is needed to history match gas-condensate wells producing from geologically complex reservoirs as well as determining the required LGR setup to history match those gas-condensate producers. History matching Hai Thach gas-condensate production wells is extremely challenging due to the combined effect of condensate banking, sub-seismic fault network, complex reservoir distribution and connectivity, uncertain HIIP, and lack of PVT data for most reservoirs. In fact, for some wells, many trial simulation runs were conducted before it became clear that LGR with transmissibility multiplier was required to obtain good history matching. In order to minimize this time-consuming trial-and-error process, machine learning was applied in this study to analyze production data using synthetic samples generated by a very large number of compositional sector models so that the need for LGR could be identified before the history matching process begins. Furthermore, machine learning application could also determine the required LGR setup. The method helped provide better models in a much shorter time, and greatly improved the efficiency and reliability of the dynamic modeling process. More than 500 synthetic samples were generated using compositional sector models and divided into separate training and test sets. Multiple classification algorithms such as logistic regression, Gaussian Naive Bayes, Bernoulli Naive Bayes, multinomial Naive Bayes, linear discriminant analysis, support vector machine, K-nearest neighbors, and Decision Tree as well as artificial neural networks were applied to predict whether LGR was used in the sector models. 
The best algorithm was found to be the Decision Tree classifier, with 100% accuracy on the training set and 99% accuracy on the test set. The LGR setup (size of LGR area and range of transmissibility multiplier) was also predicted best by the Decision Tree classifier with 91% accuracy on the training set and 88% accuracy on the test set. The machine learning model was validated using actual production data and the dynamic models of history-matched wells. Finally, using the machine learning prediction on wells with poor history matching results, their dynamic models were updated and significantly improved.
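The classification step can be sketched without any ML library. The Decision Tree in the paper learns threshold splits on production-data features; a single-split "stump" over a made-up feature (here, a hypothetical drawdown-like indicator, not one of the paper's actual inputs) shows the same mechanic in miniature:

```python
def train_stump(X, y):
    """Fit a one-split decision stump: pick the (feature, threshold, direction)
    with the fewest training misclassifications. y holds 0/1 labels
    (1 = LGR with transmissibility multiplier needed)."""
    best_err, best_rule = len(y) + 1, None
    for j in range(len(X[0])):                    # every feature
        for t in sorted({row[j] for row in X}):   # every observed value as threshold
            for sign in (1, -1):                  # direction of the inequality
                pred = [1 if sign * (row[j] - t) >= 0 else 0 for row in X]
                err = sum(p != label for p, label in zip(pred, y))
                if err < best_err:
                    best_err, best_rule = err, (j, t, sign)
    return best_rule

def predict(rule, row):
    j, t, sign = rule
    return 1 if sign * (row[j] - t) >= 0 else 0
```

A full decision tree applies this split search recursively to each branch; with the paper's ~500 synthetic sector-model samples, such threshold rules are what the reported 99%-accuracy classifier is built from.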

