Reconciling Time-Lapse Seismic and Production Data Using Streamline Models: The Bay Marchand Field, Gulf of Mexico

D.W. Vasco, Akhil Datta-Gupta, Zhong He, Ronald Behrens, James Rickett, ...
2000
2001, Vol 20 (3), pp. 278-289
Xuri Huang, Robert Will, Mashiur Khan, Larry Stanley

Energies, 2021, Vol 14 (4), pp. 1052
Baozhong Wang, Jyotsna Sharma, Jianhua Chen, Patricia Persaud

Estimation of fluid saturation is an important step in dynamic reservoir characterization, and machine learning techniques have been used increasingly in recent years in reservoir saturation prediction workflows. However, most of these studies require input parameters derived from cores, petrophysical logs, or seismic data, which may not always be readily available. Additionally, very few studies incorporate production data, even though it is an important reflection of dynamic reservoir properties and typically the most frequently and reliably measured quantity throughout the life of a field. In this research, a random forest ensemble machine learning algorithm is implemented that uses field-wide production and injection data (both measured at the surface) as the only input parameters to predict time-lapse oil saturation profiles at well locations. The algorithm is optimized through feature selection based on feature importance scores and Pearson correlation coefficients, in combination with geophysical domain knowledge. The workflow is demonstrated using actual field data from a structurally complex, heterogeneous, and heavily faulted offshore reservoir. The random forest model captures trends from three and a half years of historical field production, injection, and simulated saturation data to predict future time-lapse oil saturation profiles at four deviated well locations with over 90% R-squared, less than 6% root mean square error, and less than 7% mean absolute percentage error in each case.
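The general pattern described in this abstract can be sketched in a few lines: a random forest regressor trained on surface rates, with feature importance scores and Pearson correlation coefficients computed as the inputs to feature selection. This is a minimal illustration on synthetic data, not the authors' dataset or tuned workflow; the feature layout and target relationship are hypothetical.

```python
# Hedged sketch: random forest saturation prediction from surface rates.
# Features and target are synthetic stand-ins for the field data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 500
# Columns: hypothetical oil production, water production, water injection rates
X = rng.random((n, 3))
# Synthetic oil saturation: loosely driven by injection (column 2) plus noise
y = 0.8 - 0.5 * X[:, 2] + 0.05 * rng.standard_normal(n)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X[:400], y[:400])

# Inputs to feature selection, as in the abstract: impurity-based
# importance scores and Pearson correlation with the target
importances = model.feature_importances_
pearson_r = [np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])]

score = model.score(X[400:], y[400:])  # R-squared on held-out samples
```

Features with both low importance and weak Pearson correlation would be dropped before retraining, which is one common way to combine the two criteria the abstract mentions.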


2018, Vol 15 (4), pp. 1561-1587
Rafael Souza, David Lumley, Jeffrey Shragge, Alessandra Davolio, Denis José Schiozer

2019, Vol 38 (10), pp. 754-761
Liqin Sang, Uwe Klein-Helmkamp, Andrew Cook, Juan R. Jimenez

Seismic direct hydrocarbon indicators (DHIs) are routinely used in the identification of hydrocarbon reservoirs and in the positioning of drilling targets. Understanding seismic amplitude reliability and character, including amplitude variation with offset (AVO), is key to correct interpretation of the DHI and to confident assessment of the commercial viability of the reservoir targets. In many cases, interpretation is impeded by limited availability of data that are often less than perfect. Here, we present a seismic quantitative interpretation (QI) workflow that made the best of imperfect data and successfully derisked a multiwell drilling campaign in the Auger and Andros basins in the deepwater Gulf of Mexico. Data challenges included azimuthal illumination effects caused by the presence of the Auger salt dome, sand thickness below tuning, and long-term production effects that are hard to quantify without dedicated time-lapse seismic. In addition, seismic vintages with varying acquisition geometries led to different QI predictions that further complicated the interpretation. Given these challenges, we implemented an amplitude derisking workflow that combined ray-based illumination assessments and prestack data observations to guide selection of the optimal seismic data set(s) for QI analysis. This was followed by forward modeling to quantify the fluid saturation and sand thickness effects on seismic amplitude. Combined with structural geology analysis of the well targets, this workflow significantly reduced the risk of the proposed opportunities. The work also highlighted potential pitfalls in AVO interpretation, including AVO inversion for the characterization of reservoirs near salt, while providing a workflow for prestack amplitude quality control prior to inversion. The workflow is adaptable to specific target conditions and can be executed in a time-efficient manner. It has been applied to multiple infill well opportunities, but for simplicity we demonstrate the application here on a single well target.
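The forward modeling step this abstract refers to — predicting how amplitude varies with offset across an interface — is commonly approximated with the two-term Shuey equation. The sketch below uses that standard approximation as an illustrative stand-in (the abstract does not specify the authors' modeling method), and the layer properties for a shale over gas sand are hypothetical.

```python
# Hedged sketch: two-term Shuey AVO forward model (Shuey, 1985).
# Layer elastic properties below are illustrative, not from the paper.
import numpy as np

def shuey_two_term(vp1, vs1, rho1, vp2, vs2, rho2, theta_deg):
    """P-wave reflection coefficient vs incidence angle, R(theta) = R0 + G sin^2(theta)."""
    vp, vs, rho = (vp1 + vp2) / 2, (vs1 + vs2) / 2, (rho1 + rho2) / 2
    dvp, dvs, drho = vp2 - vp1, vs2 - vs1, rho2 - rho1
    r0 = 0.5 * (dvp / vp + drho / rho)                            # intercept
    g = 0.5 * dvp / vp - 2 * (vs / vp) ** 2 * (drho / rho + 2 * dvs / vs)  # gradient
    theta = np.radians(theta_deg)
    return r0 + g * np.sin(theta) ** 2

# Hypothetical shale (upper) over gas sand (lower): a class III response,
# where the negative amplitude brightens with increasing offset
angles = np.arange(0, 41, 5)  # incidence angles in degrees
refl = shuey_two_term(2900, 1330, 2.29, 2540, 1620, 2.09, angles)
```

Running such a model for brine-filled versus gas-filled sand, and for varying sand thickness, is one way to quantify the fluid and tuning effects on amplitude that the workflow describes.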


SPE Journal, 2006, Vol 11 (04), pp. 418-430
Karl D. Stephen, Juan Soldo, Colin Macbeth, Mike A. Christie

Summary
Time-lapse (or 4D) seismic is increasingly being used as a qualitative description of reservoir behavior for management and decision-making purposes. When combined quantitatively with geological and flow modeling as part of history matching, improved predictions of reservoir production can be obtained. Here, we apply a method of multiple-model history matching based on simultaneous comparison of spatial data offered by seismic as well as individual well-production data. Using a petroelastic transform and suitable rescaling, forward-modeled simulations are converted into predictions of seismic impedance attributes and compared to observed data by calculation of a misfit. A similar approach is applied to dynamic well data. This approach improves on gradient-based methods by avoiding entrapment in local minima. We demonstrate the method by applying it to the UKCS Schiehallion reservoir, updating the operator's model. We consider a number of parameters to be uncertain. The reservoir's net to gross is initially updated to better match the observed baseline acoustic impedance derived from the RMS amplitudes of the migrated stack. We then history match simultaneously for permeability, fault transmissibility multipliers, and the petroelastic transform parameters. Our results show a good match to the observed seismic and well data with significant improvement to the base case.

Introduction
Reservoir management requires tools such as simulation models to predict asset behavior. History matching is often employed to alter these models so that they compare favorably to observed well rates and pressures. This well information is obtained at discrete locations and thus lacks the areal coverage necessary to accurately constrain dynamic reservoir parameters such as permeability and the location and effect of faults. Time-lapse seismic captures the effect of pressure and saturation on seismic impedance attributes, giving 2D maps or 3D volumes of the missing information. The process of seismic history matching attempts to overlap the benefits of both types of information to improve estimates of the reservoir model parameters. We first present an automated multiple-model history-matching method that includes time-lapse seismic along with production data, based on an integrated workflow (Fig. 1). It improves on the classical approach, wherein the engineer manually adjusts parameters in the simulation model. Our method also improves on gradient-based methods, such as Steepest Descent, Gauss-Newton, and Levenberg-Marquardt algorithms (e.g., Lépine et al. 1999; Dong and Oliver 2003; Gosselin et al. 2003; Mezghani et al. 2004), which are good at finding local likelihood maxima but can fail to find the global maximum. Our method is also faster than stochastic methods such as genetic algorithms and simulated annealing, which often require more simulations and may have slower convergence rates. Finally, multiple models are generated, enabling posterior uncertainty analysis in a Bayesian framework (as in Stephen and MacBeth 2006a).
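The misfit-based comparison at the core of this approach — forward-modeled impedance attributes and well rates each compared to observations, with candidate models ranked by total misfit — can be sketched as a weighted least-squares sum. The data values, weights, and model names below are hypothetical; the paper's actual misfit definition, rescaling, and petroelastic transform are more elaborate.

```python
# Hedged sketch: combined seismic + production misfit for ranking
# multiple candidate reservoir models. All numbers are illustrative.
import numpy as np

def combined_misfit(obs_imp, sim_imp, obs_rate, sim_rate,
                    sigma_imp=1.0, sigma_rate=10.0):
    """Least-squares misfit summed over seismic map cells and well-rate samples."""
    m_seis = np.sum(((obs_imp - sim_imp) / sigma_imp) ** 2)
    m_prod = np.sum(((obs_rate - sim_rate) / sigma_rate) ** 2)
    return m_seis + m_prod

# Observed data: impedance attribute at a few map cells, rates at two wells
obs_imp = np.array([5.1, 5.3, 4.9])
obs_rate = np.array([120.0, 95.0])

# Two candidate models: the base case and a history-matched update
models = {
    "base":    (np.array([5.6, 5.0, 4.5]), np.array([140.0, 80.0])),
    "updated": (np.array([5.2, 5.25, 4.95]), np.array([118.0, 97.0])),
}

# Rank models by misfit; the lowest-misfit model best matches both data types
ranked = sorted(models, key=lambda k: combined_misfit(
    obs_imp, models[k][0], obs_rate, models[k][1]))
```

In a multiple-model framework such as this one, the misfits of many sampled models would also feed a Bayesian likelihood for posterior uncertainty analysis, rather than just selecting a single best match.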


1999, Vol 69 (2), pp. 327-333
R.H. Bradshaw, D.M. Broom

A comparison was made of sow lying behaviour, piglet aggregation behaviour and performance in crates (no. = 10) and oval pens (no. = 8). Twenty-four hour time-lapse video tapes were made and a farrowing day defined for each sow by noting the 24-h period during which the sow gave birth (09:00 to 09:00 h). Each sow and litter, balanced for parity and time of year, was analysed from 12:00 to 20:00 h during the 24 h immediately following this day. The following analyses were conducted: (1) the number and type of lying behaviour; (2) each litter was scanned every 10 min and at each lying event the number of piglets within 0.3 m of the sow noted; two indices were then calculated, based on the mean of the 10-min scans and the mean for the lying events, for each sow expressed as a proportion of the total litter size. Any dead piglets were removed and cause of mortality established by post-mortem examination. Production data showed that there was no significant difference between litter size at birth and at weaning, but the overall level of mortality was higher in the pen compared with the crate due to crushing. The majority of crushing events occurred in the first 3 days after farrowing (crate 75%; oval pen 64%). The total number of lying events and related posture changes did not differ between systems; only 'roll-over' events (movement from lateral on one side to the other within 10 s) were higher in the oval pen. There was no difference in the proportion of aggregating piglets at the 10-min scans or the lying events. Increased crushing mortality in the pen does not appear to be due to the aggregation behaviour of piglets but to the increased number of sow roll-over behaviours.

