Uncertainty Analysis of Two Methods in Hydrocarbon Prediction under Different Water Saturation and Noise Conditions

2019 ◽  
Vol 9 (23) ◽  
pp. 5239
Author(s):  
Changcheng Liu ◽  
Deva Ghosh ◽  
Ahmed Mohamed Ahmed Salim

The uncertainty of two recently proposed methods, the "new fluid factor" and "delta K", is analyzed under different water saturation and noise conditions through Monte Carlo modelling. The new fluid factor performs reliably (all metric parameters above 0.9) for water saturations of up to 95%. Delta K performs even better (all metric parameters close to 1) and is able to distinguish hydrocarbon from brine without interference from high water saturation. The results show that both methods remain stable in high water-saturation scenarios. The noise analysis indicates that the methods are sensitive to noise in the input data: performance is excellent when the noise is relatively low (−20 dB) and decreases with increasing noise energy. The new fluid factor, which operates in the interface domain, is more sensitive to noise than delta K, which operates in the impedance domain. The metric parameters of the new fluid factor and delta K fall in the range of 0.5 to 0.8 when the noise is high (−7 dB). High-quality input data and integration with other geophysical methods can effectively reduce these risks. In addition, two widely used traditional methods (fluid factor and Lambda-Rho) are analyzed for comparison. The new fluid factor and delta K outperform the traditional methods under both high water saturation and high noise conditions.
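The Monte Carlo noise test described above can be sketched in a few lines: perturb a clean attribute trace with Gaussian noise at a given noise-to-signal energy ratio in dB, and average a similarity metric over many realizations. This is a minimal illustration, not the authors' workflow; the sine-wave stand-in attribute, the use of correlation as the metric, and all parameter values are assumptions.

```python
import numpy as np

def add_noise(signal, noise_db, rng):
    """Add Gaussian noise at a given noise-to-signal energy ratio (in dB)."""
    signal_power = np.mean(signal ** 2)
    noise_power = signal_power * 10 ** (noise_db / 10.0)  # -20 dB -> 1% of signal energy
    return signal + rng.normal(0.0, np.sqrt(noise_power), signal.shape)

def monte_carlo_metric(signal, noise_db, n_iter=500, seed=0):
    """Mean correlation between the clean attribute and its noisy realizations."""
    rng = np.random.default_rng(seed)
    corrs = [np.corrcoef(signal, add_noise(signal, noise_db, rng))[0, 1]
             for _ in range(n_iter)]
    return float(np.mean(corrs))

t = np.linspace(0, 1, 256)
attribute = np.sin(2 * np.pi * 5 * t)           # stand-in for a fluid-factor trace
low_noise = monte_carlo_metric(attribute, -20)  # relatively low noise
high_noise = monte_carlo_metric(attribute, -7)  # relatively high noise
```

At −20 dB the noise carries only 1% of the signal energy, so the metric stays near 1; raising the noise level to −7 dB visibly degrades it.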

2013 ◽  
Vol 59 (2) ◽  
pp. 25-39 ◽  
Author(s):  
Ivan Mudron ◽  
Michal Podhoranyi ◽  
Juraj Cirbus ◽  
Branislav Devečka ◽  
Ladislav Bakay

Abstract This paper summarizes the methods and results of error modelling and propagation analyses in the Olše and Stonávka confluence area. In terrain analyses, the outputs are always a function of the input. Two approaches, depending on the input data, were used to generate field elevation errors, which subsequently entered the error propagation analysis. The main goal of this research was to show the importance of input data in slope estimation, to estimate the propagation of elevation errors, and to identify DEM errors and their consequences. Dependencies were also investigated to achieve a better prediction of slope errors. Four digital elevation model (DEM) resolutions (0.5, 1, 5 and 10 meters), all originating from a LIDAR survey, were examined, with Root Mean Square Errors (RMSE) of up to 0.317 meters (10 m DEM). In the analyses, a stochastic Monte Carlo simulation was performed with 250 iterations. The article focuses on error propagation in a large-scale area using high-quality input DEMs and Monte Carlo methods. The DEM uncertainty (RMSE) was obtained by sampling and ground survey (RTK GPS) and from the subtraction of two DEMs. Based on the empirical error distribution, a semivariogram was used to model spatially autocorrelated uncertainty in elevation. The second procedure modelled the uncertainty without autocorrelation using a random N(0, RMSE) error generator. Statistical summaries were drawn to investigate the expected hypotheses. As expected, the error in slopes increases with increasing vertical error in the input DEM. Consistent with similar studies that used different DEM input data, high-quality LIDAR input data decreases the output uncertainty. Errors modelled without spatial autocorrelation do not result in a greater variance of the resulting slope error.
In this case, although the slope error results (comparing random uncorrelated and empirical autocorrelated error fields) did not show any statistically significant difference, the input elevation error pattern was not normally distributed, and therefore the random error generator is not a suitable representation of the true state of elevation errors. The normal distribution was rejected because of the high kurtosis and extreme values (outliers). On the other hand, it can offer an important insight into the expected elevation and slope errors. Geology does not influence the slope error in the study area.
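The second (uncorrelated) procedure can be sketched as follows: in each Monte Carlo iteration, add N(0, RMSE) noise to the DEM, recompute the slope, and summarize the spread of the results. This is a minimal sketch assuming a synthetic tilted-plane DEM and a central-difference slope; the paper's semivariogram-based autocorrelated case is not reproduced here.

```python
import numpy as np

def slope_degrees(dem, cell):
    """Slope in degrees from central differences (a simplification of the Horn method)."""
    dzdy, dzdx = np.gradient(dem, cell)
    return np.degrees(np.arctan(np.hypot(dzdx, dzdy)))

def propagate_slope_error(dem, cell, rmse, n_iter=250, seed=0):
    """Monte Carlo: perturb the DEM with uncorrelated N(0, RMSE) noise and
    return the per-cell standard deviation of the resulting slopes."""
    rng = np.random.default_rng(seed)
    slopes = np.stack([slope_degrees(dem + rng.normal(0.0, rmse, dem.shape), cell)
                       for _ in range(n_iter)])
    return slopes.std(axis=0)

# Synthetic 10 m DEM: a uniformly tilted plane, with the study's largest RMSE (0.317 m)
x, y = np.meshgrid(np.arange(50), np.arange(50))
dem = 0.05 * 10.0 * x                              # 5% uniform slope
err_low = propagate_slope_error(dem, 10.0, 0.05)
err_high = propagate_slope_error(dem, 10.0, 0.317)
```

As the abstract states, the slope error grows with the vertical error of the input DEM: the spread obtained with RMSE = 0.317 m is clearly larger than with RMSE = 0.05 m.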


1985 ◽  
Vol 25 (06) ◽  
pp. 945-953 ◽  
Author(s):  
Mark A. Miller ◽  
H.J. Ramey

Abstract Over the past 20 years, a number of studies have reported temperature effects on two-phase relative permeabilities in porous media. Some of the reported results, however, have been contradictory. Also, observed effects have not been explained in terms of fundamental properties known to govern two-phase flow. The purpose of this study was to attempt to isolate the fundamental properties affecting two-phase relative permeabilities at elevated temperatures. Laboratory dynamic-displacement relative permeability measurements were made on unconsolidated and consolidated sand cores with water and a refined white mineral oil. Experiments were run on 2-in. [5.1-cm] -diameter, 20-in. [50.8-cm] -long cores from room temperature to 300F [149C]. Unlike previous researchers, we observed essentially no changes with temperature in either residual saturations or relative permeability relationships. We concluded that previous results may have been affected by viscous instabilities, capillary end effects, and/or difficulties in maintaining material balances. Introduction Interest in measuring relative permeabilities at elevated temperatures began in the 1960s with petroleum industry interest in thermal oil recovery. Early thermal oil recovery field operations (well heaters, steam injection, in-situ combustion) indicated oil flow rate increases far in excess of what was predicted by viscosity reductions resulting from heating. This suggested that temperature affects relative permeabilities. One of the early studies of temperature effects on relative permeabilities was presented by Edmondson, who performed dynamic displacement measurements with crude and white oils and distilled water in Berea sandstone cores. Edmondson reported that residual oil saturations (ROS's) (at the end of 10 PV's of water injected) decreased with increasing temperature.
Relative permeability ratios decreased with temperature at high water saturations but increased with temperature at low water saturations. A series of elevated-temperature, dynamic-displacement relative permeability measurements on clean quartz and "natural" unconsolidated sands were reported by Poston et al. Like Edmondson, Poston et al. reported a decrease in the "practical" ROS (at less than 1% oil cut) as temperature increased. Poston et al. also reported an increase in irreducible water saturation. Although irreducible water saturations decreased with decreasing temperature, they did not revert to the original room-temperature values. It was assumed that the cores became increasingly water-wet with an increase in both temperature and time; measured changes of the IFT and the contact angle with temperature increase, however, were not sufficient to explain the observed effects. Davidson measured dynamic-displacement relative permeability ratios on a coarse sand and gravel core with white oil displaced by distilled water, nitrogen, and superheated steam at temperatures up to 540F [282C]. Starting from irreducible water saturation, relative permeability ratio curves were similar to Edmondson's. Starting from 100% oil saturation, however, the curves changed significantly only at low water saturations. A troublesome aspect of Davidson's work was that he used a hydrocarbon solvent to clean the core between experiments. No mention was made of any consideration of wettability changes, which could explain the large increases in irreducible water saturations observed in some runs. Sinnokrot et al. followed Poston et al.'s suggestion of increasing water-wetness and performed water/oil capillary pressure measurements on consolidated sandstone and limestone cores from room temperature up to 325F [163C].
Sinnokrot et al. confirmed that, for sandstones, irreducible water saturation appeared to increase with temperature. Capillary pressures increased with temperature, and the hysteresis between drainage and imbibition curves reduced to essentially zero at 300F [149C]. With limestone cores, however, irreducible water saturations remained constant with increase in temperature, as did capillary pressure curves. Weinbrandt et al. performed dynamic-displacement experiments on small (0.24 to 0.49 cu in. [4 to 8 cm3] PV) consolidated Boise sandstone cores to 175F [79C] with distilled water and white oil. Oil relative permeabilities shifted toward high water saturations with increasing temperature, while water relative permeabilities exhibited little change. Weinbrandt et al. confirmed the findings of previous studies that irreducible water saturation increases and ROS decreases with increasing temperature. SPEJ P. 945


2004 ◽  
Vol 336 (6) ◽  
pp. 553-560 ◽  
Author(s):  
Vincent Chaplot ◽  
Christian Walter ◽  
Pierre Curmi ◽  
Alain Hollier-Larousse ◽  
Henri Robain

2005 ◽  
Vol 42 (1) ◽  
pp. 110-120 ◽  
Author(s):  
M A Shahin ◽  
M B Jaksa ◽  
H R Maier

Traditional methods of predicting the settlement of shallow foundations on granular soils are far from accurate and consistent. This can be attributed to the fact that the problem is very complex and not yet entirely understood. Recently, artificial neural networks (ANNs) have been shown to outperform the most commonly used traditional methods for predicting the settlement of shallow foundations on granular soils. However, despite its relative advantage, the ANN-based approach does not take into account the uncertainty that may affect the magnitude of the predicted settlement. Artificial neural networks, like more traditional methods of settlement prediction, are based on deterministic approaches that ignore this uncertainty and thus provide single values of settlement with no indication of the level of risk associated with these values. An alternative stochastic approach is essential to provide a more rational estimation of settlement. In this paper, the likely distribution of predicted settlements, given the uncertainties associated with settlement prediction, is obtained by combining Monte Carlo simulation with a deterministic ANN model. A set of stochastic design charts, which incorporate the uncertainty associated with the ANN method, is developed. The charts are useful in that they enable the designer to make informed decisions regarding the level of risk associated with predicted settlements and consequently provide a more realistic indication of what the actual settlement might be. Key words: settlement prediction, shallow foundations, neural networks, Monte Carlo, stochastic simulation.
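The combination of Monte Carlo simulation with a deterministic model can be sketched as follows: sample perturbed inputs, push each sample through the deterministic model, and read design percentiles off the resulting empirical distribution. The closed-form `settlement_model` here is a hypothetical surrogate standing in for the trained ANN, and the lognormal input perturbation, the coefficient of variation, and all parameter values are assumptions.

```python
import numpy as np

def settlement_model(q, B, N):
    """Stand-in for the deterministic ANN: settlement (mm) from footing
    pressure q (kPa), width B (m) and average SPT blow count N.
    (Hypothetical surrogate -- not the authors' trained network.)"""
    return 2.0 * q * np.sqrt(B) / N

def stochastic_settlement(q, B, N, cov=0.15, n_iter=10_000, seed=0):
    """Monte Carlo: perturb the inputs with lognormal noise and return the
    empirical distribution of predicted settlements."""
    rng = np.random.default_rng(seed)
    noisy = lambda v: v * rng.lognormal(0.0, cov, n_iter)
    return settlement_model(noisy(q), noisy(B), noisy(N))

s = stochastic_settlement(q=200.0, B=2.0, N=20)
p50, p90 = np.percentile(s, [50, 90])  # percentiles of the kind a design chart reports
```

A single deterministic run would return one number; the percentiles of `s` quantify the risk attached to it, which is the information the stochastic design charts encode.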


2021 ◽  
Author(s):  
Nasser Faisal Al-Khalifa ◽  
Mohammed Farouk Hassan ◽  
Deepak Joshi ◽  
Asheshwar Tiwary ◽  
Ihsan Taufik Pasaribu ◽  
...  

Abstract The Umm Gudair (UG) Field is a carbonate reservoir in West Kuwait with more than 57 years of production history. The average water cut of the field has reached close to 60 percent due to the long production history and varying drawdown across different parts of the field, which has caused the current oil/water contact (COWC) to undulate. As a result, there is high uncertainty in the COWC, which impacts the drilling strategy in the field. The typical approach used to develop the lower part of the carbonate is to drill deviated wells down to the original oil/water contact (OOWC) to establish the saturation profile, later cement back up to above the high-water-saturation zone, and then perforate with a standoff. This method has not shown encouraging results, and high water cut persists. An innovative solution is required: a technology that can proactively indicate, while drilling, that the COWC is being approached, and allow drilling to be geo-stopped with an optimal standoff between the bit and the detected water contact. The recently developed electromagnetic (EM) look-ahead resistivity technology was considered and first implemented in the Umm Gudair (UG) Field. It uses an electromagnetic signal that can detect resistivity features ahead of the bit while drilling, enabling proactive decisions that reduce the drilling, geological and reservoir risks related to well placement challenges.


2020 ◽  
Vol 24 (8) ◽  
pp. 4061-4090 ◽  
Author(s):  
Silvia Terzago ◽  
Valentina Andreoli ◽  
Gabriele Arduini ◽  
Gianpaolo Balsamo ◽  
Lorenzo Campo ◽  
...  

Abstract. Snow models are usually evaluated at sites providing high-quality meteorological data, so that the uncertainty in the meteorological input data can be neglected when assessing model performances. However, high-quality input data are rarely available in mountain areas and, in practical applications, the meteorological forcing used to drive snow models is typically derived from spatial interpolation of the available in situ data or from reanalyses, whose accuracy can be considerably lower. In order to fully characterize the performances of a snow model, the model sensitivity to errors in the input data should be quantified. In this study we test the ability of six snow models to reproduce snow water equivalent, snow density and snow depth when they are forced by meteorological input data with gradually lower accuracy. The SNOWPACK, GEOTOP, HTESSEL, UTOPIA, SMASH and S3M snow models are forced, first, with high-quality measurements performed at the experimental site of Torgnon, located at 2160 m a.s.l. in the Italian Alps (control run). Then, the models are forced by data at gradually lower temporal and/or spatial resolution, obtained by (i) sampling the original Torgnon 30 min time series at 3, 6, and 12 h, (ii) spatially interpolating neighbouring in situ station measurements and (iii) extracting information from GLDAS, ERA5 and ERA-Interim reanalyses at the grid point closest to the Torgnon site. Since the selected models are characterized by different degrees of complexity, from highly sophisticated multi-layer snow models to simple, empirical, single-layer snow schemes, we also discuss the results of these experiments in relation to the model complexity. 
The results show that, when forced by accurate 30 min resolution weather station data, the single-layer, intermediate-complexity snow models HTESSEL and UTOPIA provide similar skills to the more sophisticated multi-layer model SNOWPACK, and these three models show better agreement with observations and more robust performances over different seasons compared to the lower-complexity models SMASH and S3M. All models forced by 3-hourly data provide similar skills to the control run, while the use of 6- and 12-hourly temporal resolution forcings may lead to a reduction in model performances if the incoming shortwave radiation is not properly represented. The SMASH model generally shows low sensitivity to the temporal degradation of the input data. Spatially interpolated data from neighbouring stations and reanalyses are found to be adequate forcings, provided that temperature and precipitation variables are not affected by large biases over the considered period. However, a simple bias-adjustment technique applied to ERA-Interim temperatures allowed all models to achieve similar performances to the control run. Regardless of their complexity, all models show weaknesses in the representation of the snow density.
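The bias adjustment of reanalysis temperatures mentioned above can be illustrated with the simplest possible scheme: removing the mean bias relative to station observations over the overlap period. The abstract does not specify the exact technique used, so this additive correction and the synthetic daily series are assumptions.

```python
import numpy as np

def bias_adjust(reanalysis, station):
    """Shift the reanalysis series by its mean bias over the overlap period.
    A minimal additive correction; the paper's exact scheme may differ."""
    return reanalysis - np.mean(reanalysis - station)

rng = np.random.default_rng(0)
station = 5.0 + 8.0 * np.sin(np.linspace(0.0, 2.0 * np.pi, 365))  # synthetic daily T (degC)
reanalysis = station - 2.5 + rng.normal(0.0, 0.5, 365)            # cold-biased noisy copy
adjusted = bias_adjust(reanalysis, station)
```

After the shift, the mean bias of the forcing is zero by construction, while day-to-day variability (and any error in it) is left untouched; this is consistent with the abstract's point that such a correction helps only when the remaining errors are not dominated by bias.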


2014 ◽  
Vol 19 (1) ◽  
pp. 55-66
Author(s):  
Ramūnas Markauskas ◽  
Algimantas Juozapavičius ◽  
Kęstutis Saniukas ◽  
Giedrius Bernotavičius

In this article the authors present a method for backbone (spine) recognition and modelling. The recognition process combines classical techniques (the Hough transformation, GVF snakes) with new ones (a method for initial curvature detection, which the authors call the Falling Ball method). The result enables the identification of high-quality features of the spine and the detection of the major deformities of the backbone: the intercrestal line, the centre sacral vertical line and the C7 plumb line, as well as the angles of the proximal thoracic, main thoracic and thoracolumbar/lumbar curves. These features are used for measurements in adolescent idiopathic scoliosis, especially in the course of treatment. The input data are just radiographic images, as met in everyday practice.


Author(s):  
Алексей Николаевич Самойлов ◽  
Юрий Михайлович Бородянский ◽  
Александр Валерьевич Волошин

In the process of automating the solution of applied measurement tasks, including those based on photogrammetric methods, the problem arises of matching the measurement system to the object and the measurement conditions. For the measuring system to assess in advance the possibility of obtaining reliable results, and to adapt as well as possible to the measurement conditions, specialized algorithms and models are necessary. In general, such models are aimed at qualified technicians with the necessary knowledge in the field of information technology.
A feature of the application of photogrammetric measurement systems in the forestry and metallurgical industries is the low qualification of users in the field of information technology, which stems from the nature of the work performed and the conditions of employment. This factor makes it impossible to solve the adjustment problem with traditional methods, in which the user controls the configuration process. In this regard, the article proposes a model and an algorithm for forming a measuring system from primary input data, in which the system itself controls the adjustment process.


2021 ◽  
Vol 0 (0) ◽  
Author(s):  
Ronny Peter ◽  
Luca Bifano ◽  
Gerhard Fischerauer

Abstract The quantitative determination of material parameter distributions in resonant cavities is a relatively new method for the real-time monitoring of chemical processes. For this purpose, electromagnetic resonances of the cavity resonator are used as input data for the reverse calculation (inversion). However, the reverse calculation algorithm is sensitive to disturbances in the input data, which produces measurement errors, and it tends to diverge, in which case no measurement result is obtained at all. In this work, a correction algorithm based on the Monte Carlo method is presented which ensures convergent behavior of the reverse calculation algorithm.
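One way such a Monte Carlo correction can work is to re-run the inversion on many randomly perturbed copies of the measured input, discard non-convergent runs, and aggregate the remainder. The toy inversion below (the filled-cavity relation f = f0/√ε, solved by a fixed-point loop that mimics an iterative solver) and all parameter values are assumptions for illustration, not the authors' algorithm.

```python
import numpy as np

def invert(f_measured, f_empty, max_iter=50, tol=1e-9):
    """Toy reverse calculation: solve f_measured = f_empty / sqrt(eps) for the
    permittivity eps. The closed form makes the loop trivially convergent here;
    the real cavity inversion is iterative and can diverge."""
    eps = 1.0
    for _ in range(max_iter):
        new = (f_empty / f_measured) ** 2
        if abs(new - eps) < tol:
            return new
        eps = new
    return np.nan  # solver failed to converge

def robust_invert(f_measured, f_empty, sigma=1e4, n_iter=200, seed=0):
    """Monte Carlo correction: re-run the inversion on perturbed copies of the
    measured resonance, drop non-convergent runs, return the median result."""
    rng = np.random.default_rng(seed)
    results = [invert(f_measured + rng.normal(0.0, sigma), f_empty)
               for _ in range(n_iter)]
    results = [r for r in results if np.isfinite(r)]
    return float(np.median(results))

eps_est = robust_invert(f_measured=1.0e9, f_empty=2.0e9)  # filled/empty ratio 1:2
```

The median over the convergent runs is insensitive both to occasional solver failures and to outlier results, which is what makes the overall behavior of the corrected algorithm convergent.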

