Gaussian Process Methodology for Multi-Frequency Marine Controlled-Source Electromagnetic Profile Estimation in Isotropic Medium

Processes ◽  
2019 ◽  
Vol 7 (10) ◽  
pp. 661 ◽  
Author(s):  
Muhammad Naeim Mohd Aris ◽  
Hanita Daud ◽  
Sarat Chandra Dass ◽  
Khairul Arifin Mohd Noh

The marine controlled-source electromagnetic (CSEM) technique uses electromagnetic (EM) waves to image the electrical resistivity of the subsurface beneath the seabed. Modeling marine CSEM data is a crucial and time-consuming task due to the complexity of its mathematical equations; solving the resulting linear systems therefore incurs a high computational cost, especially for high-dimensional models. To address these problems, we propose a Gaussian process (GP) calibrated with computer experiment outputs to estimate multi-frequency marine CSEM profiles at various hydrocarbon depths. The methodology utilizes prior information to provide informative EM profiles with uncertainty quantification in terms of variance (95% confidence interval). In this paper, prior marine CSEM information was generated with Computer Simulation Technology (CST) software at various observed hydrocarbon depths (250–2750 m in increments of 250 m) and different transmission frequencies (0.125, 0.25, and 0.5 Hz). A two-dimensional (2D) forward GP model was developed for each frequency using this marine CSEM information. The uncertainty measurements showed that the estimates were close to the mean. For model validation, the calculated root mean square error (RMSE) and coefficient of variation (CV) showed good agreement between the computer outputs and the estimated EM profiles at unobserved hydrocarbon depths.
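As an illustrative aside, the core of such a GP estimator with a 95% confidence interval can be sketched in a few lines of NumPy. The squared-exponential kernel, the 500 m length scale, and the exponential curve standing in for the CST outputs are all assumptions made for this sketch, not the authors' actual model:

```python
import numpy as np

def rbf_kernel(a, b, length_scale=500.0):
    """Squared-exponential covariance over depths (m); unit prior variance."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_predict(x_train, y_train, x_test, jitter=1e-6):
    """GP posterior mean and variance at test depths (zero prior mean)."""
    K = rbf_kernel(x_train, x_train) + jitter * np.eye(len(x_train))
    Ks = rbf_kernel(x_test, x_train)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks @ alpha
    v = np.linalg.solve(L, Ks.T)
    var = 1.0 - np.sum(v * v, axis=0)  # prior variance is 1 on the diagonal
    return mean, np.maximum(var, 0.0)

# Observed design: depths 250-2750 m in 250 m steps, with a hypothetical
# smooth EM magnitude curve standing in for the CST outputs.
depths = np.arange(250.0, 3000.0, 250.0)
response = np.exp(-depths / 1500.0)

# Estimate the profile at an unobserved depth of 1375 m.
mean, var = gp_predict(depths, response, np.array([1375.0]))
lower = mean - 1.96 * np.sqrt(var)  # 95% confidence interval
upper = mean + 1.96 * np.sqrt(var)
```

The 1.96 multiplier converts the posterior standard deviation into the 95% interval of the kind reported in the abstract.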

Processes ◽  
2020 ◽  
Vol 8 (5) ◽  
pp. 605
Author(s):  
Muhammad Naeim Mohd Aris ◽  
Hanita Daud ◽  
Khairul Arifin Mohd Noh ◽  
Sarat Chandra Dass

An electromagnetic (EM) technique is employed in seabed logging (SBL) to detect offshore hydrocarbon-saturated reservoirs. In risk analysis for hydrocarbon exploration, computer simulation for subsurface modelling is a crucial task. It can be expensive and time-consuming due to its complicated mathematical equations, and only a few realizations of input–output pairs can be generated after a very lengthy computation. Understanding the unknown functions without any uncertainty measurement can be very challenging as well. We propose model calibration between a stochastic process and a computer experiment for magnitude-versus-offset (MVO) analysis. Two-dimensional (2D) Gaussian process (GP) models were developed for low frequencies of 0.0625–0.5 Hz at different hydrocarbon depths to estimate EM responses at untried observations with reduced computation time. The calculated error measurements revealed that the estimates matched the computer simulation technology (CST) outputs well. GP models were then fitted to the MVO plots to provide uncertainty quantification. Based on the confidence intervals, hydrocarbons were difficult to identify, especially when their depth reached 3000 m below the seabed. The normalized magnitudes for the other frequencies also agreed with the resulting predictive variance. Thus, the model resolution for EM data decreases as the hydrocarbon depth increases, even when multiple low frequencies are used in the SBL application.
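For reference, the two validation metrics used by this line of work (RMSE and CV) are straightforward to compute. One common definition of CV, assumed here, is the RMSE normalized by the mean of the reference values; the magnitude values below are made-up placeholders, not data from the paper:

```python
import numpy as np

def rmse(pred, obs):
    """Root mean square error between estimates and reference outputs."""
    return np.sqrt(np.mean((pred - obs) ** 2))

def coefficient_of_variation(pred, obs):
    """RMSE normalized by the mean of the reference values, in percent."""
    return 100.0 * rmse(pred, obs) / np.mean(obs)

# Hypothetical normalized magnitudes at several offsets:
# CST (reference) output vs. GP estimate.
cst = np.array([1.00, 0.81, 0.64, 0.52, 0.43])
gp = np.array([0.99, 0.82, 0.63, 0.52, 0.44])
```

A small CV (a few percent or less) indicates the GP estimates track the simulator outputs closely.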


2020 ◽  
Vol 142 (2) ◽  
Author(s):  
Wangbai Pan ◽  
Meiyan Zhang ◽  
Guoan Tang

Abstract Mistuning phenomena exist in bladed disks due to inevitable deviations among the blades' properties, e.g., stiffness, mass, and geometry, leading to localization and response amplification. The dynamic performance of a mistuned bladed disk is sensitive to the arrangement of its blades, and blade arrangement optimization aims to find the arrangement that minimizes the influence of mistuning. In this paper, a highly efficient framework is proposed to address the high computational cost of this optimization. It combines a mixed-dimensional finite element model (MDFEM), Gaussian process (GP) regression, and a genetic algorithm (GA). The MDFEM performs mistuned modal analysis efficiently and rapidly provides the training set for GP regression. The GP model, acting as a surrogate, predicts the desired dynamic performance directly without running the numerical model and serves as the fitness function in the optimization. The GA is well suited to combinatorial problems and is a good option for problems with large search domains and several local maxima/minima. The techniques and workflow of the three methods are illustrated in detail. Case studies based on a real turbine are presented in a gradually progressive manner to verify the effectiveness, accuracy, and efficiency of each method and of the entire framework step by step. The results show a satisfactory optimal arrangement for a randomly chosen set of mistuned blades, with the influence of mistuning indeed reduced, and the time cost of the optimization reduced by several orders of magnitude. This framework is a promising approach to the blade arrangement optimization problem.
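A toy version of the surrogate-in-the-loop idea can be sketched as follows. The cheap `expensive_model` proxy (which stands in for the MDFEM modal analysis and is not real rotor dynamics), the feature encoding, the kernel length scale, and the swap-mutation search standing in for the full GA are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
mistuning = rng.normal(0.0, 0.02, size=8)  # hypothetical blade stiffness deviations

def expensive_model(arrangement):
    """Stand-in for the MDFEM modal analysis: maps a blade arrangement to a
    response amplification factor (illustrative proxy only)."""
    dev = mistuning[arrangement]
    return 1.0 + np.sum(np.abs(np.diff(dev, append=dev[0])))

def features(arrangement):
    """Encode an arrangement as the ordered vector of blade deviations."""
    return mistuning[arrangement]

def rbf(A, B, ell=0.05):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

# Train the GP surrogate on a small set of expensive evaluations.
train = [rng.permutation(8) for _ in range(40)]
X = np.array([features(a) for a in train])
y = np.array([expensive_model(a) for a in train])
K = rbf(X, X) + 1e-8 * np.eye(len(X))
alpha = np.linalg.solve(K, y - y.mean())

def surrogate(arrangement):
    """GP posterior mean: a fast stand-in fitness function."""
    x = features(arrangement)[None, :]
    return y.mean() + float(rbf(x, X) @ alpha)

# Swap-mutation search using the surrogate as fitness
# (a simplified stand-in for the paper's genetic algorithm).
best = np.arange(8)
for _ in range(300):
    cand = best.copy()
    i, j = rng.choice(8, size=2, replace=False)
    cand[i], cand[j] = cand[j], cand[i]
    if surrogate(cand) < surrogate(best):
        best = cand
```

The point of the framework is that the search loop calls only the surrogate, so the expensive model is evaluated just for the small training set.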


2020 ◽  
Vol 176 (2) ◽  
pp. 183-203
Author(s):  
Santosh Chapaneri ◽  
Deepak Jayaswal

Modeling music mood has wide applications in music categorization, retrieval, and recommendation systems; however, computationally modeling the affective content of music is challenging due to its subjective nature. In this work, a structured regression framework is proposed to model the valence and arousal mood dimensions of music using a single regression model at linear computational cost. To tackle subjectivity, a confidence-interval-based estimated consensus is computed by modeling the behavior of the various annotators (e.g., biased or adversarial) and is shown to outperform the use of average annotation values. For a compact feature representation of music clips, variational Bayesian inference is used to learn a Gaussian mixture model representation of the acoustic features, and chord-related features are used to improve valence estimation by probing the chord progressions between chroma frames. The dimensionality of the features is further reduced using an adaptive version of kernel PCA. Using an efficient implementation of the twin Gaussian process for structured regression, the proposed work achieves a significant improvement in R2 for both the arousal and valence dimensions relative to state-of-the-art techniques on two benchmark datasets for music mood estimation.


Author(s):  
Daniel Blatter ◽  
Anandaroop Ray ◽  
Kerry Key

Summary Bayesian inversion of electromagnetic data produces crucial uncertainty information on inferred subsurface resistivity. Due to their high computational cost, however, Bayesian inverse methods have largely been restricted to computationally expedient 1D resistivity models. In this study, we successfully demonstrate, for the first time, a fully 2D, trans-dimensional Bayesian inversion of magnetotelluric data. We render this problem computationally tractable by using a stochastic interpolation algorithm known as a Gaussian process to achieve a parsimonious parametrization of the model relative to the dense parameter grids used in numerical forward modeling codes. The Gaussian process links a trans-dimensional, parallel-tempered Markov chain Monte Carlo sampler, which explores the parsimonious model space, to MARE2DEM, an adaptive finite element forward solver. MARE2DEM computes the model response on a dense parameter mesh with resistivity assigned via the Gaussian process model. We demonstrate the new trans-dimensional Gaussian process sampler by inverting both synthetic and field magnetotelluric data for 2D models of electrical resistivity, with the field data example converging within 10 days on 148 cores, a non-negligible but tractable computational cost. For the field data inversion, our algorithm achieves a parameter reduction of over 32x compared to the fixed parameter grid used in the MARE2DEM regularized inversion. Resistivity probability distributions computed from the ensemble of models produced by the inversion yield credible intervals and interquartile plots that quantitatively show the non-linear 2D uncertainty in model structure. This uncertainty can then be propagated to other physical properties that affect resistivity, including bulk composition, porosity, and pore-fluid content.
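The parsimonious parametrization can be illustrated with a minimal sketch: a handful of (x, z) control nodes carry log-resistivity values, and a noise-free GP mean interpolates them onto the dense mesh a forward solver would use. The node positions, values, and kernel below are invented for the sketch and bear no relation to the paper's models:

```python
import numpy as np

def rbf(A, B, ell=2.0):
    """Squared-exponential covariance between sets of (x, z) points."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

# Sparse control nodes (x, z in km) with log10-resistivity values: the
# parsimonious model that a trans-dimensional sampler would perturb.
nodes = np.array([[0.0, 0.0], [5.0, 1.0], [2.0, 4.0], [7.0, 5.0]])
log_rho = np.array([1.0, 2.0, 0.5, 1.5])

# Dense forward-modeling mesh (a stand-in for the MARE2DEM grid).
xs, zs = np.meshgrid(np.linspace(0.0, 8.0, 40), np.linspace(0.0, 6.0, 30))
mesh = np.column_stack([xs.ravel(), zs.ravel()])

# Noise-free GP mean interpolation assigns a resistivity to every mesh cell.
K = rbf(nodes, nodes) + 1e-10 * np.eye(len(nodes))
weights = np.linalg.solve(K, log_rho)
mesh_log_rho = rbf(mesh, nodes) @ weights
```

Here four node parameters stand in for 1200 mesh values; the sampler only ever proposes changes to the nodes (their number, positions, and values), while the forward solver sees the full mesh.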


2021 ◽  
Vol 11 (4) ◽  
pp. 1492
Author(s):  
Hanita Daud ◽  
Muhammad Naeim Mohd Aris ◽  
Khairul Arifin Mohd Noh ◽  
Sarat Chandra Dass

Seabed logging (SBL) is an application of electromagnetic (EM) waves for detecting potential marine hydrocarbon-saturated reservoirs that relies on a source–receiver system. One concern in the modeling and inversion of EM data is the need for a realistic representation of complex geo-electrical models. At the same time, the corresponding forward modeling algorithms should be robust and efficient, with low computational effort, for repeated use in the inversion. This work proposes a new inversion methodology consisting of two frameworks: a Gaussian process (GP), which allows greater flexibility in modeling a variety of EM responses, and gradient descent (GD), which finds the best minimizer (i.e., the hydrocarbon depth). Computer simulation technology (CST) software, which uses the finite element (FE) method, was used to generate prior EM responses for the GP to evaluate EM profiles at “untried” depths. GD was then used to minimize the mean squared error (MSE), with the GP acting as its forward model. Acquiring EM responses with mesh-based algorithms is time-consuming, so this work compared the time taken by CST and by the GP to evaluate the EM profiles. For accuracy and performance, the GP model was compared against EM responses modeled by the FE method, and the percentage error between the estimates and the “untried” computer outputs was calculated. The results indicate that GP-based inverse modeling can efficiently predict the hydrocarbon depth in SBL.
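A compact sketch of the GD-on-GP idea: train a GP on depth–response pairs, then descend the squared misfit between an observed response and the GP mean to recover the depth. The exponential response curve, learning rate, and kernel are assumptions made for the sketch, not values from the paper:

```python
import numpy as np

def rbf(a, b, ell=400.0):
    """Squared-exponential covariance over depths (m)."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

# Prior EM responses vs. depth (hypothetical smooth curve in place of CST/FE runs).
depths = np.arange(250.0, 3000.0, 250.0)
resp = np.exp(-depths / 1500.0)

K = rbf(depths, depths) + 1e-6 * np.eye(len(depths))
w = np.linalg.solve(K, resp)

def gp_forward(d):
    """GP posterior mean response at depth d: the cheap forward model."""
    return float(rbf(np.array([d]), depths) @ w)

observed = np.exp(-1100.0 / 1500.0)  # response whose source depth we seek

# Gradient descent on the squared misfit; gradient by central difference.
d, lr, h = 2000.0, 3.0e6, 1.0
for _ in range(300):
    grad = ((gp_forward(d + h) - observed) ** 2
            - (gp_forward(d - h) - observed) ** 2) / (2.0 * h)
    d -= lr * grad
```

Each GD iteration costs two GP evaluations instead of two finite element solves, which is where the speed-up over mesh-based inversion comes from.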


2021 ◽  
Author(s):  
Joel C. Najmon ◽  
Homero Valladares ◽  
Andres Tovar

Abstract Multiscale topology optimization (MSTO) is a numerical design approach for optimally distributing material within coupled design domains at multiple length scales. Due to the substantial computational cost of performing topology optimization at multiple scales, MSTO methods often feature subroutines such as homogenization of parameterized unit cells and inverse homogenization of periodic microstructures. Parameterized unit cells are of great practical use but limit the design to a pre-selected cell shape. Inverse homogenization, on the other hand, provides a physical representation of an optimal periodic microstructure at every discrete location but does not necessarily yield a manufacturable structure. To address these limitations, this paper introduces a Gaussian process regression model-assisted MSTO method that features optimal distribution of material at the macroscale and topology optimization of a manufacturable microscale structure. In the proposed approach, a macroscale optimization problem is solved using a gradient-based optimizer. The design variables are the homogenized stiffness tensors of the microscale topologies. As such, analytical sensitivities are not available, so the sensitivity coefficients are approximated using finite differences after each microscale topology is optimized. The computational cost of optimizing each microstructure is dramatically reduced by using Gaussian process regression models to approximate the homogenized stiffness tensor. The capability of the proposed MSTO method is demonstrated with two three-dimensional numerical examples. The correlation of the Gaussian process regression models is presented along with the final multiscale topologies for the two examples: a cantilever beam and a 3-point bending beam.


2018 ◽  
Vol 168 ◽  
pp. 01008 ◽  
Author(s):  
Rong-Gen Cai ◽  
Tao Yang

Gravitational waves from compact binary systems can be viewed as standard sirens to probe the evolution of the universe. This paper summarizes the potential of gravitational waves to constrain the cosmological parameters and the dark sector interaction within the Gaussian process methodology. After briefly introducing the method of reconstructing the dark sector interaction with a Gaussian process, the concept of standard sirens and the analysis of reconstructing the dark sector interaction with LISA are outlined. Furthermore, we estimate the ability of gravitational waves to constrain cosmological parameters with the Einstein Telescope (ET). The numerical methods we use are the Gaussian process and Markov chain Monte Carlo. Finally, we also forecast the improvements in the constraints on the cosmological parameters when ET and LISA are combined with Planck data.


Author(s):  
Shuai Guo ◽  
Camilo F. Silva ◽  
Wolfgang Polifke

Abstract One of the fundamental tasks in robust thermoacoustic design of gas turbine combustors is calculating the modal instability risk, i.e., the probability that a thermoacoustic mode is unstable given various sources of uncertainty (e.g., operating or boundary conditions). To alleviate the high computational cost of conventional Monte Carlo simulation, surrogate modeling techniques are usually employed. Unfortunately, in practice it is not uncommon that only a small number of training samples can be afforded for surrogate model training. As a result, such an “inaccurate” model introduces epistemic uncertainty, provoking variation in the calculated modal instability risk. In the current study, using a Gaussian process (GP) as the surrogate model, we address two questions. First, how can the variation of the modal instability risk induced by the epistemic surrogate-model uncertainty be quantified? Second, how can this variation be reduced given a limited computational budget for surrogate model training? For the first question, we leverage the Bayesian character of the GP model and perform correlated sampling of the GP predictions at different inputs to quantify the uncertainty of the risk calculation. We show how this uncertainty shrinks as more training samples become available. For the second question, we adopt an active learning strategy to intelligently allocate training samples so that the trained GP model is highly accurate, particularly in the vicinity of the zero-growth-rate contour. As a result, a more accurate and robust modal instability risk calculation is obtained without increasing the computational cost of surrogate model training.


2019 ◽  
Vol 2019 ◽  
pp. 1-11 ◽  
Author(s):  
Bin Hu ◽  
Guo-shao Su ◽  
Jianqing Jiang ◽  
Yilong Xiao

A new response surface method (RSM) for slope reliability analysis was proposed based on Gaussian process (GP) machine learning. The method involves approximating the limit state function with a trained GP model and estimating the failure probability using the first-order reliability method (FORM). A small number of training samples was first generated by the limit equilibrium method to train the GP model. The implicit limit state function of the slope and its derivatives were then approximated in explicit form by the trained GP model. Furthermore, an iterative algorithm was presented to improve the precision of the approximation of the limit state function in the region near the design point, which contributes most significantly to the failure probability. Results of four case studies, including one non-slope and three slope problems, indicate that the proposed method achieves reasonable accuracy more efficiently for slope reliability analysis than the traditional RSM.
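As a generic illustration of the response-surface idea, the sketch below trains a GP on a few evaluations of an explicit toy limit state, then estimates the failure probability on the cheap surrogate, using plain Monte Carlo in place of the paper's FORM step. The limit state, training grid, and kernel are all invented for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(A, B, ell=1.5):
    """Squared-exponential covariance between sets of 2D points."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

def limit_state(x):
    """Hypothetical explicit limit state standing in for the implicit slope
    safety margin: failure when g < 0."""
    return 3.0 - x[:, 0] - x[:, 1]

# Small training design over the standard-normal input space.
g1, g2 = np.meshgrid(np.linspace(-4.0, 4.0, 9), np.linspace(-4.0, 4.0, 9))
X = np.column_stack([g1.ravel(), g2.ravel()])
y = limit_state(X)

K = rbf(X, X) + 1e-8 * np.eye(len(X))
alpha = np.linalg.solve(K, y)

# Monte Carlo on the cheap GP surrogate (in place of the paper's FORM step).
Z = rng.standard_normal((50_000, 2))
g_hat = rbf(Z, X) @ alpha
pf = float((g_hat < 0.0).mean())  # true value here is Phi(-3/sqrt(2)), about 0.017
```

Every failure-probability sample costs one kernel evaluation rather than one limit equilibrium analysis, which is the source of the efficiency gain claimed for GP-based response surfaces.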


2019 ◽  
Vol 21 (Supplement 6) ◽  
pp. vi172-vi173
Author(s):  
Lujia Wang ◽  
Hyunsoo Yoon ◽  
Andrea Hawkins-Daarud ◽  
Kyle Singleton ◽  
Kamala Clark-Swanson ◽  
...  

Abstract INTRODUCTION The quantification of intratumoral heterogeneity through radiomics-based approaches can help resolve the regionally distinct genetic drug targets that may co-exist within a single glioblastoma (GBM) tumor. While this offers potential diagnostic value under the paradigm of individualized oncology, clinical decision-making must also consider the degree of uncertainty associated with each model. In this study, we evaluate the performance of a novel machine-learning (ML) algorithm, Gaussian process (GP) modeling, that can quantify the impact of multiple sources of uncertainty on ML model development and prediction accuracy, including variability in the copy number measurements, radiomics features, training sample characteristics, and training sample size. METHOD We collected 95 image-localized biopsies from 25 primary GBM patients. We coregistered stereotactic locations with preoperative multi-parametric MRI features (conventional MRI, DSC perfusion, diffusion tensor imaging) to generate spatially matched pairs of MRI features and copy number variants (CNV) for each biopsy. We developed a Gaussian process (GP) model to predict the CNV of the epidermal growth factor receptor (EGFR) from MRI radiomic features in each patient. We used leave-one-patient-out cross-validation to quantify prediction accuracy and model uncertainty. Spatial prediction and uncertainty (p-value) maps were overlaid to help visualize the regional genetic variation of EGFR and the uncertainty of the radiomic predictions. RESULTS The initial GP radiomics model for EGFR amplification (CNV > 3.5) produced a sensitivity of 0.8 and a specificity of 0.8. Samples/regions associated with high uncertainty (p-value > 0.05) corresponded to either 1) extrapolation of radiomic features beyond the training-set-defined feature space or 2) insufficient training samples in the feature space.
CONCLUSION We present an ML-based model that quantifies spatial genetic heterogeneity in GBM while also estimating the model uncertainties that result from multi-source data variability. This approach lays the groundwork for the prospective clinical integration of model-based diagnostic approaches in the paradigm of individualized medicine.

