Reliable Reservoir Characterization and History Matching Using a Pattern Recognition Based Distance

Author(s):  
Jiyoon Lee ◽  
Jonggeun Choe

A distance is defined as a measure of dissimilarity between two reservoir models. Many distances have been proposed for fast modeling, but some of them cause distortion or loss of the original permeability distribution of the models. To avoid such problems, this study proposes a pattern recognition based distance, defined by the difference of correlation coefficients between ensemble models. Using multi-dimensional scaling with this distance, the initial 400 ensemble models are projected onto a 2D plane, and 10 groups are then formed by K-medoids clustering. After comparing the oil production of each centroid with that of the reference field, 100 models are selected around the best centroid. We validate the clustering by comparing the uncertainty ranges of 100, 50, and 20 ensemble members sampled from the initial 400 models using box plots and cumulative distribution functions. For history matching and reservoir characterization, an ensemble smoother is applied to the 100 selected models. The proposed method requires only 25% of the simulation time while giving results as reliable as those of the initial 400 models.
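The MDS-plus-K-medoids workflow described above can be sketched in numpy. All data here are synthetic, and the paper's pattern-recognition distance is replaced by a generic dissimilarity matrix; only the projection and clustering steps are illustrated:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the ensemble: random feature vectors and a generic
# dissimilarity matrix (the paper uses a correlation-based distance).
n, k = 40, 4
features = rng.normal(size=(n, 5))
D = np.linalg.norm(features[:, None, :] - features[None, :, :], axis=-1)

# Classical multi-dimensional scaling: project the models onto a 2D plane.
J = np.eye(n) - np.ones((n, n)) / n              # centering matrix
B = -0.5 * J @ (D ** 2) @ J                      # double-centered Gram matrix
vals, vecs = np.linalg.eigh(B)
top = np.argsort(vals)[::-1][:2]
coords = vecs[:, top] * np.sqrt(np.maximum(vals[top], 0))  # n x 2 plane coords

# Simple K-medoids (PAM-style alternation) on the original distances.
medoids = rng.choice(n, size=k, replace=False)
for _ in range(50):
    labels = np.argmin(D[:, medoids], axis=1)    # assign to nearest medoid
    new_medoids = medoids.copy()
    for c in range(k):
        members = np.flatnonzero(labels == c)
        sub = D[np.ix_(members, members)]
        new_medoids[c] = members[np.argmin(sub.sum(axis=1))]
    if np.array_equal(new_medoids, medoids):
        break
    medoids = new_medoids

print(coords.shape, sorted(medoids.tolist()))
```

Each medoid is an actual ensemble member, which is why the paper can run the simulator on centroids directly, something a K-means centroid (an average model) would not allow.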

2015 ◽  
Vol 138 (1) ◽  
Author(s):  
Jihoon Park ◽  
Jeongwoo Jin ◽  
Jonggeun Choe

For decision making, it is crucial to have proper reservoir characterization and uncertainty assessment of reservoir performances. Since initial models constructed with limited data have high uncertainty, it is essential to integrate both static and dynamic data for reliable future predictions. Uncertainty quantification is computationally demanding because a single history matching requires many iterative forward simulations and optimizations, and multiple realizations of reservoir models must be computed. In this paper, a methodology is proposed to rapidly quantify uncertainties by combining streamline-based inversion and distance-based clustering. A distance between each pair of reservoir models is defined as the norm of the difference of their generalized travel time (GTT) vectors. The reservoir models are then grouped according to these distances, and representative models are selected from each group. Inversions are performed on the representative models instead of on all models. We use generalized travel time inversion (GTTI) to integrate dynamic data, overcoming high nonlinearity and taking advantage of its computational efficiency. It is verified that the proposed method gathers models with both similar dynamic responses and similar permeability distributions. It also assesses the uncertainty of reservoir performances reliably, while reducing the amount of computation significantly by using the representative models.
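The distance definition used here is easy to state concretely. A minimal sketch with hypothetical GTT vectors (synthetic numbers, not from the paper) showing the pairwise distance matrix and the selection of a representative model as a group medoid:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical generalized-travel-time (GTT) vectors, one per reservoir
# model: entry j is the travel-time shift at observation well j.
gtt = rng.normal(size=(30, 8))

# Distance between models i and j = norm of the difference of GTT vectors.
D = np.linalg.norm(gtt[:, None, :] - gtt[None, :, :], axis=-1)

# The representative of a group is its medoid: the member minimizing the
# sum of distances to all other members (here one group = all models).
representative = int(np.argmin(D.sum(axis=1)))
print(representative)
```

In the paper this distance matrix would feed the clustering step, and inversion would run only on the per-group representatives.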


1987 ◽  
Vol 19 (3) ◽  
pp. 599-631 ◽  
Author(s):  
Joseph Abate ◽  
Ward Whitt

This paper continues an investigation of the time-dependent behavior of regulated or reflecting Brownian motion (RBM). Part I focused on RBM starting at the origin; Part II focuses on RBM starting at a fixed positive state. The first two moments of RBM as functions of time are analyzed by representing them as the difference of two increasing functions, one of which is the moment function starting at the origin studied in Part I. By appropriate normalization, the two monotone components can be converted into cumulative distribution functions that can be analyzed probabilistically, e.g., their moments can be calculated. Simple approximations are then developed by fitting convenient distributions to these moments. Overall, the analysis yields a better understanding of the way RBM and related stochastic flow systems approach steady state.
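The approach to steady state that the analysis describes can be observed numerically. A Monte Carlo sketch (not from the paper) of RBM with drift −1 and unit variance, started at the fixed positive state x0 = 2, using a simple Euler scheme whose O(√dt) boundary bias is ignored here:

```python
import numpy as np

rng = np.random.default_rng(2)

# RBM with drift -1, unit variance, reflected at 0, started at x0 = 2.
# Reflection is imposed by truncation at the origin at each Euler step.
mu, sigma, x0 = -1.0, 1.0, 2.0
dt, n_paths = 0.01, 20000
x = np.full(n_paths, x0)
means = []
for step in range(1, 501):                     # simulate up to t = 5
    z = rng.standard_normal(n_paths)
    x = np.maximum(x + mu * dt + sigma * np.sqrt(dt) * z, 0.0)
    if step in (100, 300, 500):                # checkpoints t = 1, 3, 5
        means.append(x.mean())

# The stationary mean is sigma**2 / (2 * |mu|) = 0.5; starting above it,
# E[X(t)] decreases toward this steady-state value.
print([round(m, 3) for m in means])
```

Starting above the stationary mean, the first-moment function decreases monotonically toward 0.5, consistent with the monotone-component decomposition studied in the paper.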


2016 ◽  
Vol 55 (1) ◽  
pp. 112-118
Author(s):  
Kazimieras Padvelskis ◽  
Ruslan Prigodin

We consider an approximation of a cumulative distribution function F(x) by the cumulative distribution function G(x) of the Irwin law. In this case, F(x) can be the cumulative distribution function of sums (products) of independent (dependent) random variables. The remainder term of the approximation is estimated by the cumulant method, which is applied by introducing special cumulants satisfying a condition of V. Statulevičius type. The main result is a nonuniform bound for the difference |F(x) − G(x)| in terms of special cumulants of the symmetric cumulative distribution function F(x).


2010 ◽  
Vol 2010 ◽  
pp. 1-9 ◽  
Author(s):  
A. Wong

In introductory statistics texts, the power of the test of a one-sample mean when the variance is known is widely discussed. However, when the variance is unknown, the power of the Student's t-test is seldom mentioned. In this note, a general methodology for obtaining inference concerning a scalar parameter of interest in any exponential family model is proposed. The method is then applied to the one-sample mean problem with unknown variance to obtain a confidence interval for the power of the Student's t-test that detects a given difference. The calculations require only the density and cumulative distribution functions of the standard normal distribution. In addition, the presented methodology can also be applied to determine the required sample size when the effect size and the power of a test of the mean of given size are specified.
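For reference, the exact power of the two-sided one-sample t-test can be computed from the noncentral t distribution. This is the standard textbook baseline, not the paper's normal-approximation method; effect size, sample size, and significance level below are illustrative:

```python
import numpy as np
from scipy import stats

def t_test_power(delta, n, alpha=0.05):
    """Exact power of the two-sided one-sample t-test for an effect of
    delta standard deviations with sample size n at level alpha."""
    df = n - 1
    nc = delta * np.sqrt(n)                   # noncentrality parameter
    t_crit = stats.t.ppf(1 - alpha / 2, df)   # two-sided critical value
    # P(reject) under the alternative: noncentral-t mass outside +/- t_crit.
    return (1 - stats.nct.cdf(t_crit, df, nc)) + stats.nct.cdf(-t_crit, df, nc)

power = t_test_power(delta=0.5, n=30)
print(round(power, 3))
```

Inverting this function over n gives the sample-size determination mentioned in the abstract: increase n until the computed power reaches the target.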


2018 ◽  
Vol 10 (10) ◽  
pp. 1628 ◽  
Author(s):  
Hu Zhang ◽  
Ziti Jiao ◽  
Lei Chen ◽  
Yadong Dong ◽  
Xiaoning Zhang ◽  
...  

The reflectance anisotropy effect on albedo retrieval was evaluated using the Moderate Resolution Imaging Spectroradiometer (MODIS) bidirectional reflectance distribution function (BRDF) product and archetypal BRDFs. Shortwave-band archetypal BRDFs were established and validated based on the Anisotropy Flat indeX (AFX) and time series of MODIS BRDF over tile h11v03. To generate surface albedo, archetypal BRDFs were used to fit simulated reflectances based on the least squares method. Albedo was also retrieved based on the least root-mean-square-error (RMSE) method or on prior BRDF knowledge based on the normalized difference vegetation index (NDVI). The difference between those albedos and the MODIS albedo was used to quantify the reflectance anisotropy effect. The albedo over tile h11v03 for day 185 in 2009 was retrieved from single directional reflectance and the third archetypal BRDF. The results show that six archetypal BRDFs are sufficient to represent the reflectance anisotropy for albedo estimation. For the data used in this study, the relative uncertainty caused by reflectance anisotropy can reach up to 7.4%, 16.2%, and 20.2% for sufficient multi-angular, insufficient multi-angular, and single directional observations, respectively. The intermediate archetypal BRDFs may be used to improve the albedo retrieval accuracy from insufficient or single observations with a relative uncertainty range of 8–15%.
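The core fitting step can be sketched generically: each archetype fixes an anisotropy shape, a single magnitude scale is fitted to the observed directional reflectances by least squares, and the archetype with the lowest fitting RMSE supplies the albedo. All numbers below are synthetic stand-ins, not MODIS kernel values:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic archetypal BRDF shapes sampled at n_geom viewing geometries,
# plus a stand-in white-sky albedo per archetype (illustrative only).
n_geom, n_arch = 12, 6
archetypes = np.abs(rng.normal(1.0, 0.3, size=(n_arch, n_geom)))
arch_albedo = archetypes.mean(axis=1)

# Observations constructed to follow archetype 2 at twice its magnitude.
obs = 2.0 * archetypes[2] + rng.normal(0, 0.01, size=n_geom)

# One-parameter least-squares scale per archetype, then pick least-RMSE fit.
scales = (archetypes @ obs) / (archetypes * archetypes).sum(axis=1)
rmse = np.sqrt(((scales[:, None] * archetypes - obs) ** 2).mean(axis=1))
best = int(np.argmin(rmse))
retrieved_albedo = scales[best] * arch_albedo[best]
print(best, round(retrieved_albedo, 3))
```

The design choice mirrors the abstract: with only one free parameter per archetype, even a single directional observation can constrain the retrieval once the anisotropy shape is supplied by the archetype.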


Author(s):  
Xiaoyu Zheng ◽  
Hiroto Itoh ◽  
Hitoshi Tamaki ◽  
Yu Maruyama

The quantitative evaluation of the fission product release to the environment during a severe accident is of great importance. In the present analysis, the integral severe accident code MELCOR 1.8.5 has been applied to estimate the uncertainty of the source term for the accident at Unit 2 of the Fukushima Daiichi nuclear power plant (NPP) as an example, and to identify models and parameters influential to the source term. Forty-two parameters associated with models for the transport of radioactive materials were chosen and narrowed down to 18 through a set of screening analyses. These 18 parameters, in addition to 9 parameters relevant to in-vessel melt progression obtained from a preceding uncertainty study, were input to a subsequent sensitivity analysis by the Morris method. This one-factor-at-a-time approach can preliminarily identify inputs that have important effects on an output, and 17 important parameters were selected from the total of 27 through this approach. The selected parameters were then used in an uncertainty analysis by means of the Latin hypercube sampling technique and the Iman-Conover method, taking correlations between parameters into account. Cumulative distribution functions of representative source terms were obtained through the present uncertainty analysis, assuming failure of the suppression chamber. Correlation coefficients between the outputs and the uncertain input parameters were calculated to identify parameters with great influence on the source terms; these include parameters related to models of core component failure, aerosol dynamics, and pool scrubbing.
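The Latin hypercube sampling step of such a workflow can be sketched with SciPy's quasi-Monte Carlo utilities. Parameter names, bounds, and dimensions below are illustrative, not MELCOR inputs, and the Iman-Conover correlation-induction step is omitted:

```python
import numpy as np
from scipy.stats import qmc

# Latin hypercube sample of 100 points over 3 hypothetical parameters.
sampler = qmc.LatinHypercube(d=3, seed=42)
unit = sampler.random(n=100)                 # stratified points in [0, 1)^3
lower = [0.1, 1e-4, 300.0]                   # illustrative parameter bounds
upper = [0.9, 1e-2, 1500.0]
params = qmc.scale(unit, lower, upper)       # map to physical ranges

# LHS stratification: each 1-D margin has exactly one point per 1/100 bin,
# which is what gives LHS its variance advantage over plain Monte Carlo.
counts = np.histogram(unit[:, 0], bins=100, range=(0.0, 1.0))[0]
print(params.shape, counts.min(), counts.max())
```

Each row of `params` would drive one forward code run; the empirical CDFs of the resulting outputs are the uncertainty bands the abstract describes.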


2016 ◽  
Vol 35 (1) ◽  
pp. 3-23 ◽  
Author(s):  
Honggeun Jo ◽  
Hyungsik Jung ◽  
Jongchan Ahn ◽  
Kyungbook Lee ◽  
Jonggeun Choe

Ensemble Kalman filter (EnKF) has been widely studied due to its excellent recursive data processing, dependable uncertainty quantification, and real-time update. However, many previous works have shown poor characterization results on channel reservoirs with non-Gaussian permeability distributions, which do not satisfy the Gaussian assumption of the EnKF algorithm. To meet the assumption, normal score transformation can be applied to the ensemble parameters. Even though this preserves the initial permeability distribution of the ensembles, it cannot provide reliable results when the initial reservoir models are quite different from the reference. In this study, an ensemble-based history matching scheme is suggested for channel reservoirs using EnKF with continuous update of channel information. We define channel information consisting of the facies ratio and the mean permeability of each rock facies. These are added to the ensemble state vector of EnKF and updated recursively along with the other model parameters. Using the updated channel information, the ensemble parameters are retransformed after each assimilation step. The proposed method gives better characterization results even when poorly designed initial ensemble members are used. The method also alleviates the overshooting problem of EnKF without further modification of the EnKF algorithm. The methodology is applied to channel reservoirs with extremely non-Gaussian permeability distributions. The results show that the updated models find the channel pattern successfully and that the uncertainty range decreases appropriately, enabling reasonable decisions. Although the initial channel information of the ensemble members differs greatly from that of the reference, it is updated to follow the reference.
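The assimilation step underlying this scheme is the standard EnKF analysis. A minimal perturbed-observation sketch in numpy (generic, not the paper's full channel workflow): in the paper, the facies ratio and per-facies mean permeability would be appended to the state vector below and updated jointly with the other parameters.

```python
import numpy as np

rng = np.random.default_rng(4)

# Dimensions: 100 ensemble members, 12 state components, 3 observations.
n_ens, n_state, n_obs = 100, 12, 3
true_state = rng.normal(size=n_state)
H = np.eye(n_obs, n_state)                   # observe the first 3 components
R = 0.05 * np.eye(n_obs)                     # observation-error covariance

X = true_state[:, None] + rng.normal(size=(n_state, n_ens))  # forecast ensemble
d = (H @ true_state)[:, None] + rng.multivariate_normal(
    np.zeros(n_obs), R, size=n_ens).T                        # perturbed observations

A = X - X.mean(axis=1, keepdims=True)        # ensemble anomalies
C = A @ A.T / (n_ens - 1)                    # sample covariance
K = C @ H.T @ np.linalg.inv(H @ C @ H.T + R) # Kalman gain
Xa = X + K @ (d - H @ X)                     # analysis ensemble

print(Xa.shape)
```

The retransformation mentioned in the abstract would sit between assimilation steps: parameters are mapped to Gaussian scores before the update above and mapped back afterwards using the updated channel information.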


Mathematics ◽  
2021 ◽  
Vol 9 (9) ◽  
pp. 981
Author(s):  
Patricia Ortega-Jiménez ◽  
Miguel A. Sordo ◽  
Alfonso Suárez-Llorens

The aim of this paper is twofold. First, we show that the expectation of the absolute value of the difference between two copies, not necessarily independent, of a random variable is a measure of its variability in the sense of Bickel and Lehmann (1979). Moreover, if the two copies are negatively dependent through stochastic ordering, this measure is subadditive. The second purpose of this paper is to provide sufficient conditions for comparing several distances between pairs of random variables (with possibly different distribution functions) in terms of various stochastic orderings. Applications in actuarial and financial risk management are given.
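For independent copies, the variability measure in question is the classical Gini mean difference. A small Monte Carlo check (standard fact, not from the paper) for X ~ N(0, 1), where the closed form is E|X − X′| = 2/√π ≈ 1.128:

```python
import numpy as np

rng = np.random.default_rng(5)

# Monte Carlo estimate of E|X - X'| for two independent standard normal
# copies; X - X' ~ N(0, 2), so E|X - X'| = 2 / sqrt(pi).
n = 200_000
x = rng.standard_normal(n)
x_prime = rng.standard_normal(n)             # independent copy
gmd = np.abs(x - x_prime).mean()

# With comonotonic copies (x_prime = x) the measure would be 0:
# the dependence structure of the two copies matters.
print(round(gmd, 3))
```

Replacing the independent copy with a negatively dependent one increases the measure, which is the regime in which the paper establishes subadditivity.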

