Applying Support Vector Regression to Reduce the Effect of Numerical Noise and Enhance the Performance of History Matching

Author(s): Zhenyu Guo, Chaohui Chen, Guohua Gao, Jeroen Vink
SPE Journal, 2018, Vol. 23 (06), pp. 2428-2443

Summary Numerical optimization is an integral part of many history-matching (HM) workflows, but optimization performance can be degraded by numerical noise in the forward models when gradients are estimated numerically. Numerical noise, an unavoidable part of reservoir simulation, refers to errors caused by incomplete convergence of linear or nonlinear solvers or by truncation errors arising from different timestep cuts. More precisely, the allowed solver tolerances and allowed changes of pressure and saturation mean that simulation results no longer change smoothly with changing model parameters. For HM with the linear-interpolation-based distributed Gaussian-Newton method (L-DGN), this discontinuity of simulation results can make the sensitivity matrix computed by linear interpolation less accurate, which may result in slow convergence or, even worse, failure to converge. Recently, we developed an HM workflow, referred to as SVR-DGN, that integrates support-vector regression (SVR) with the distributed-Gaussian-Newton (DGN) optimization method. Unlike L-DGN, which computes the sensitivity matrix from a simple linear proxy, SVR-DGN computes the sensitivity matrix by taking the gradient of the SVR proxies. In this paper, we provide theoretical analysis and case studies showing that SVR-DGN computes a more accurate sensitivity matrix than L-DGN and is insensitive to the negative influence of numerical noise. We also propose a cost-saving training procedure that replaces bad training points (those with relatively large objective-function values) with training points (simulation data) that have smaller objective-function values and were generated at the most recent iterations. Both the L-DGN approach and the newly proposed SVR-DGN approach are tested first on a 2D toy problem to show the effect of numerical noise on their convergence performance. We find that their performance is comparable when the toy problem is free of numerical noise. As the numerical-noise level increases, the performance of L-DGN degrades sharply, whereas the SVR-DGN performance remains quite stable. Both methods are then tested on a real-field HM example. The convergence performance of SVR-DGN is robust under both tight and loose numerical settings, whereas the performance of L-DGN degrades significantly when loose numerical settings are applied.
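The paper's actual SVR-DGN implementation is not reproduced here, but the core idea the abstract describes — differentiating a smooth SVR proxy analytically instead of finite-differencing noisy simulator outputs — can be sketched in a few lines. Everything below (the 1D toy response, the kernel settings, the helper `svr_gradient`) is illustrative, not the authors' code:

```python
import numpy as np
from sklearn.svm import SVR

# Hypothetical 1D "simulator" response corrupted by numerical noise
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 40)
y = np.sin(2 * np.pi * x) + 0.02 * rng.standard_normal(x.size)

# Fit an RBF-kernel SVR proxy to the noisy responses
gamma = 10.0
svr = SVR(kernel="rbf", C=100.0, gamma=gamma, epsilon=0.01).fit(x[:, None], y)

def svr_gradient(x0):
    """Analytic derivative of the fitted RBF-SVR proxy at x0.

    f(x) = sum_i a_i * exp(-gamma * (x - x_i)**2) + b, hence
    f'(x) = sum_i a_i * (-2 * gamma * (x - x_i)) * exp(-gamma * (x - x_i)**2).
    """
    sv = svr.support_vectors_.ravel()   # support points x_i
    a = svr.dual_coef_.ravel()          # coefficients a_i
    k = np.exp(-gamma * (x0 - sv) ** 2)
    return float(np.sum(a * (-2.0 * gamma * (x0 - sv)) * k))

# Finite-difference slope from two adjacent noisy samples
# (the analogue of a linear-interpolation sensitivity)
i = 20
fd_slope = (y[i + 1] - y[i]) / (x[i + 1] - x[i])
print(f"SVR gradient {svr_gradient(x[i]):+.3f}  vs  noisy FD {fd_slope:+.3f}")
```

Because the RBF-SVR prediction is a smooth sum of kernels, its derivative averages out sample-to-sample noise, whereas the finite-difference slope amplifies the noise by roughly its amplitude divided by the step size — the same mechanism the abstract identifies as degrading the linear-interpolation sensitivity matrix in L-DGN.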


2020, Vol. 25 (1), pp. 24-38
Author(s): Eka Patriya

Stocks are financial-market instruments widely chosen by investors as an alternative funding source, but stocks traded on the financial market often experience high price fluctuations (rises and falls). Investors therefore stand not only to gain but also to suffer losses in the future. One indicator investors should watch when investing in stocks is the movement of the Composite Stock Price Index (Indeks Harga Saham Gabungan, IHSG). Analyzing the IHSG is important for investors, with the aim of finding a trend or pattern that may repeat in past stock-price movements, which can then be used to predict future price movements. One method that can predict stock-price movements accurately is machine learning. In this study, a model for predicting the IHSG closing price was built using the Support Vector Regression (SVR) algorithm; it achieved good predictive and generalization ability, with training and testing RMSE of 14.334 and 20.281, and training and testing MAPE of 0.211% and 0.251%, respectively. The results of this study are expected to help investors make decisions when formulating stock-investment strategies.
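The study's data and tuned hyperparameters are not given in the abstract; a minimal, illustrative sketch of the same kind of pipeline — lagged closing prices fed to an RBF-kernel SVR and scored with RMSE and MAPE — might look as follows (the random-walk series and all parameter values are placeholders, not IHSG data):

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error, mean_absolute_percentage_error

# Placeholder random-walk series standing in for daily IHSG closing prices
rng = np.random.default_rng(1)
prices = 6000.0 + np.cumsum(rng.normal(0.0, 20.0, 300))

# Features: the previous 5 closes; target: today's close
lags = 5
X = np.array([prices[i - lags:i] for i in range(lags, len(prices))])
y = prices[lags:]

# Chronological train/test split (no shuffling for time series)
split = int(0.8 * len(y))
X_tr, X_te, y_tr, y_te = X[:split], X[split:], y[:split], y[split:]

# Illustrative hyperparameters; a real study would tune C, gamma, epsilon
model = SVR(kernel="rbf", C=1000.0, gamma="scale", epsilon=1.0).fit(X_tr, y_tr)
pred = model.predict(X_te)

rmse = np.sqrt(mean_squared_error(y_te, pred))
mape = 100.0 * mean_absolute_percentage_error(y_te, pred)
print(f"test RMSE = {rmse:.3f}, test MAPE = {mape:.3f}%")
```

In practice one would also standardize the lag features and select the hyperparameters by cross-validation; the RMSE and MAPE figures quoted in the abstract come from the paper's tuned model on actual IHSG data, not from this sketch.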


