The Extreme Function Theory for Damage Detection: An Application to Civil and Aerospace Structures

2021 ◽  
Vol 11 (4) ◽  
pp. 1716
Author(s):  
Davide Martucci ◽  
Marco Civera ◽  
Cecilia Surace

Any damaged condition is a rare occurrence for a mechanical system, as it is very unlikely to be observed; it therefore represents an extreme deviation from the median of its probability distribution. Proper statistical tools are thus required, i.e., Rare Event Modelling (REM). The classic tool for this purpose is the Extreme Value Theory (EVT), which deals with uni- or multivariate scalar values. The Extreme Function Theory (EFT), on the other hand, extends the fundamental EVT concepts to whole functions. When combined with Gaussian Process Regression (GPR), the EFT is well suited for mode shape-based outlier detection: the structure's normal modes can be investigated as a whole rather than as collections of constituent data points, with quantifiable advantages. This provides a useful tool for Structural Health Monitoring, especially for reducing false alarms. This recently proposed methodology is here tested and validated, both numerically and experimentally, on several examples from Civil and Aerospace Engineering applications. One-dimensional beam-like elements with several boundary conditions are considered, as well as a two-dimensional plate-like spar and a frame structure.
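As a toy illustration of the GPR side of this pipeline, the sketch below fits a Gaussian process to simulated healthy mode shapes of a beam and scores each new shape as a whole function rather than point by point. It is a minimal stand-in, not the authors' EFT implementation: the beam, noise level, damage pattern, and the diagonal log-likelihood score are all illustrative assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 30)[:, None]

# "Healthy" first bending mode of a pinned-pinned beam, plus measurement noise
healthy = [np.sin(np.pi * x.ravel()) + 0.01 * rng.standard_normal(30)
           for _ in range(10)]

# One GP fitted to the pooled healthy mode shapes
X = np.vstack([x] * len(healthy))
gp = GaussianProcessRegressor(RBF(0.2) + WhiteKernel(1e-4), normalize_y=True)
gp.fit(X, np.concatenate(healthy))
mu, sd = gp.predict(x, return_std=True)

def shape_score(shape):
    """Score the whole mode shape at once (diagonal Gaussian log-likelihood)."""
    z = (shape - mu) / sd
    return float(-0.5 * np.sum(z ** 2 + np.log(2 * np.pi * sd ** 2)))

scores = np.array([shape_score(s) for s in healthy])

# A local stiffness loss distorts the mode shape near midspan
damaged = np.sin(np.pi * x.ravel()) * (
    1 - 0.3 * np.exp(-((x.ravel() - 0.5) / 0.05) ** 2))

# Flag a shape whose function-level score falls below the healthy minimum
assert shape_score(damaged) < scores.min()
```

The function-level score aggregates all points of a mode shape, which is what lets a whole-function method trade isolated point-wise excursions (false alarms) against coherent deviations.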

Author(s):  
Arvind Keprate ◽  
R. M. Chandima Ratnayake ◽  
Shankar Sankararaman

The main aim of this paper is to perform the validation of the adaptive Gaussian process regression model (AGPRM) developed by the authors for the Stress Intensity Factor (SIF) prediction of a crack propagating in topside piping. For validation purposes, the values of SIF obtained from experiments available in the literature are used. Sixty-six data points (consisting of L, a, c and SIF values obtained by experiments) are used to train the AGPRM, while four independent data sets are used for validation purposes. The experimental validation of the AGPRM also consists of the comparison of the prediction accuracy of AGPRM and Finite Element Method (FEM) relative to the experimentally derived SIF values. Four metrics, namely, Root Mean Square Error (RMSE), Average Absolute Error (AAE), Maximum Absolute Error (MAE), and Coefficient of Determination (R2), are used to compare the accuracy. A case study illustrating the development and experimental validation of the AGPRM is presented. Results indicate that the prediction accuracy of the AGPRM is comparable with and even higher than that of the FEM, provided the training points of the AGPRM are aptly chosen.
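The four comparison metrics named in the abstract are standard and easy to reproduce; the sketch below computes them with NumPy on made-up SIF values (the numbers are illustrative, not the paper's data). Note that here MAE denotes the maximum absolute error, following the abstract's naming.

```python
import numpy as np

def rmse(y, yhat):
    return float(np.sqrt(np.mean((y - yhat) ** 2)))          # root mean square error

def aae(y, yhat):
    return float(np.mean(np.abs(y - yhat)))                  # average absolute error

def mae_max(y, yhat):
    return float(np.max(np.abs(y - yhat)))                   # maximum absolute error

def r2(y, yhat):
    ss_res = np.sum((y - yhat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return float(1 - ss_res / ss_tot)                        # coefficient of determination

# Illustrative values only, not the paper's experimental data
y_exp  = np.array([10.0, 12.5, 15.0, 18.0])   # "experimental" SIF
y_pred = np.array([10.2, 12.3, 15.4, 17.7])   # model prediction

assert aae(y_exp, y_pred) <= rmse(y_exp, y_pred) <= mae_max(y_exp, y_pred)
assert 0.9 < r2(y_exp, y_pred) <= 1.0
```

The chained inequality AAE ≤ RMSE ≤ MAE holds for any data, which is a quick sanity check when implementing the metrics.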


2018 ◽  
Vol 2018 ◽  
pp. 1-15 ◽  
Author(s):  
Hongtai Cheng ◽  
Wei Li

A Delta robot is typically mounted on a frame and performs high-speed pick-and-place tasks from top to bottom. Because of its outstanding acceleration capability and high center of mass, the Delta robot can generate significant frame vibration. Existing trajectory-smoothing methods mainly focus on reducing vibration of the robot rather than of the frame, while modifying the frame structure increases manufacturing cost. In this paper, an acceleration profile optimization approach is proposed to reduce the Delta robot-frame vibration. The profile is determined by the maximum jerk, acceleration, and velocity. The pick-and-place motion (PPM) and the resulting frame vibration are analyzed in the frequency domain. Quantitative analysis shows that frame vibration can be reduced by altering these dynamic motion parameters. Because the analytic model is derived under several simplifications, it cannot be applied directly; a surrogate model-based optimization method is therefore proposed to handle the practical issues. By executing the PPM with different parameters and measuring the resulting vibration, a surrogate model is derived using Gaussian Process Regression (GPR). To reduce the frame vibration without sacrificing robot efficiency, the two goals are fused according to their priorities. Based on the surrogate model, a single-objective optimization problem is formulated and solved by a Genetic Algorithm (GA). Experimental results show the effectiveness of the proposed method, and the behavior of the optimal parameters also verifies the robot-frame vibration mechanism.
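A compressed sketch of this surrogate-based loop, under heavy assumptions: the vibration "measurement" is a synthetic function, the efficiency proxy and priority weight are invented, and SciPy's differential evolution stands in for the paper's genetic algorithm.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from scipy.optimize import differential_evolution

rng = np.random.default_rng(1)

def measure_vibration(jmax, amax):
    """Stand-in for executing one pick-and-place cycle and measuring frame vibration."""
    return np.exp(-jmax) + 0.5 * amax ** 2 + 0.02 * rng.standard_normal()

# Run the motion with a few (max jerk, max acceleration) settings
X = rng.uniform([0.1, 0.1], [3.0, 2.0], size=(25, 2))
y = np.array([measure_vibration(j, a) for j, a in X])

# Fit a GPR surrogate of vibration as a function of the motion parameters
surrogate = GaussianProcessRegressor(RBF([1.0, 1.0]), alpha=1e-6,
                                     normalize_y=True).fit(X, y)

def cost(p):
    vib = surrogate.predict(p.reshape(1, -1))[0]
    cycle_time = 1.0 / p[0] + 1.0 / p[1]   # crude efficiency proxy (assumed)
    return vib + 0.1 * cycle_time          # fuse the two goals by priority weight

# Evolutionary search over the surrogate (GA stand-in)
res = differential_evolution(cost, bounds=[(0.1, 3.0), (0.1, 2.0)], seed=0)
assert res.success
```

The key design point mirrored here is that the expensive physical experiment is queried only to build the surrogate; the evolutionary search then runs entirely on the cheap GPR model.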


Author(s):  
Arvind Keprate ◽  
R. M. Chandima Ratnayake ◽  
Shankar Sankararaman

Evaluation of the stress intensity factor (SIF) for a crack propagating in a structural component is the analytical basis of the linear elastic fracture mechanics (LEFM) approach. Handbook solutions give accurate SIF results for simple crack geometries; for intricate crack geometries and complex loading conditions, the finite element method (FEM) is used to predict the SIF. The main drawback of FEM techniques is that they are prohibitively expensive in terms of computing cost and are also very time consuming. In this manuscript, the authors present a Gaussian Process Regression Model (GPRM) that may be used as an alternative to FEM for predicting the SIF of a propagating crack. The GPRM is first trained using 70 SIF values obtained by FEM, and then validated by comparing the values of SIF predicted by the GPRM and by FEM for 30 data points (i.e., combinations of crack size and loading). The average residual percentage between the two is 2.57%, indicating good agreement between the GPRM and the FEM model. Moreover, the time required to predict the SIF of the 30 data points is reduced from 30 minutes (for FEM) to 10 seconds with the proposed GPRM.
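The train/validate split described above can be mimicked on synthetic data. The sketch below uses the closed-form centre-crack solution K = σ√(πa) as a stand-in for the FEM results (an assumption; the paper's geometry is topside piping), trains a GP on 70 points, and checks the average residual percentage on 30 held-out points.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(2)

# Synthetic stand-in for FEM data: centre crack in an infinite plate
a = rng.uniform(0.005, 0.05, 100)     # crack size [m]
sigma = rng.uniform(50, 200, 100)     # remote stress [MPa]
K = sigma * np.sqrt(np.pi * a)        # "FEM" SIF [MPa m^0.5]

X = np.column_stack([a, sigma])
gp = GaussianProcessRegressor(ConstantKernel(1.0) * RBF([0.02, 75.0]),
                              alpha=1e-8, normalize_y=True)
gp.fit(X[:70], K[:70])                # train on 70 points, as in the abstract

pred = gp.predict(X[70:])             # validate on the remaining 30 points
avg_residual_pct = float(np.mean(100 * np.abs(pred - K[70:]) / K[70:]))
assert avg_residual_pct < 5.0
```

Once trained, the surrogate evaluates new (a, σ) combinations in microseconds, which is the source of the 30-minutes-to-10-seconds speed-up claimed for the 30 validation points.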


2021 ◽  
Vol 7 (2) ◽  
pp. 287-290
Author(s):  
Jannik Prüßmann ◽  
Jan Graßhoff ◽  
Philipp Rostalski

Abstract Gaussian processes are a versatile tool for data processing. Unfortunately, due to storage and runtime requirements, standard Gaussian process (GP) methods are limited to a few thousand data points and are thus infeasible for most biomedical spatio-temporal problems. The methods treated in this work cover GP inference and hyperparameter optimization, exploiting the Kronecker structure of the covariance matrices. To solve regression and source separation problems, two different approaches are presented: the first uses efficient matrix-vector products, whilst the second is based on efficient solutions to the eigendecomposition; the latter also enables efficient hyperparameter optimization. In comparison to standard GP methods, the proposed methods can be applied to very large biomedical datasets without loss of performance and run substantially faster. The performance is demonstrated on esophageal manometry data, where the cardiac and respiratory signal components are to be inferred by source separation.
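The matrix-vector-product trick the first approach relies on can be shown in a few lines: for a separable kernel on a grid, the Kronecker identity (K_t ⊗ K_s)v = vec(K_t V K_sᵀ) avoids ever forming the full covariance matrix. This is a generic illustration of the identity, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(3)

def rbf(x, ell):
    """Squared-exponential covariance matrix on a 1D grid."""
    d = x[:, None] - x[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

# Separable covariance on a 40 x 30 spatio-temporal grid
t = np.linspace(0, 1, 40)
s = np.linspace(0, 1, 30)
Kt, Ks = rbf(t, 0.1), rbf(s, 0.2)

v = rng.standard_normal(40 * 30)

# Naive product with the full 1200 x 1200 matrix: O(n^2) memory and time
full = np.kron(Kt, Ks) @ v

# Kronecker trick: (Kt x Ks) v = vec(Kt V Ks^T), with V = v reshaped to the grid
fast = (Kt @ v.reshape(40, 30) @ Ks.T).ravel()

assert np.allclose(full, fast)
```

The reshaped product touches only the two small factor matrices, which is what lets Kronecker-structured GP inference scale to grids far beyond the usual few-thousand-point limit.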


2019 ◽  
Vol 9 (3) ◽  
pp. 20180083 ◽  
Author(s):  
Seungjoon Lee ◽  
Felix Dietrich ◽  
George E. Karniadakis ◽  
Ioannis G. Kevrekidis

In statistical modelling with Gaussian process regression, it has been shown that combining (few) high-fidelity data with (many) low-fidelity data can enhance prediction accuracy, compared to prediction based on the few high-fidelity data only. Such information fusion techniques for multi-fidelity data commonly approach the high-fidelity model f_h(t) as a function of two variables (t, s), and then use f_l(t) as the s data. More generally, the high-fidelity model can be written as a function of several variables (t, s_1, s_2, ...); the low-fidelity model f_l and, say, some of its derivatives can then be substituted for these variables. In this paper, we will explore mathematical algorithms for multi-fidelity information fusion that use such an approach towards improving the representation of the high-fidelity function with only a few training data points. Given that f_h may not be a simple function (and sometimes not even a function) of f_l, we demonstrate that using additional functions of t, such as derivatives or shifts of f_l, can drastically improve the approximation of f_h through Gaussian processes. We also point out a connection with 'embedology' techniques from topology and dynamical systems. Our illustrative examples range from instructive caricatures to computational biology models, such as Hodgkin–Huxley neural oscillations.


Author(s):  
Samuel da Silva ◽  
Luis G G Villani ◽  
Marc Rebillat ◽  
Nazih Mechbal

Abstract This paper demonstrates the applicability of a Gaussian process regression model combined with a nonlinear autoregressive exogenous (NARX) framework, using experimental data measured with PZT patches bonded to a composite aeronautical structure, as part of a novel SHM strategy. A stiffened carbon-epoxy plate is studied in a healthy condition and with simulated damage at the center of the bottom part of the stiffener. The performance of the identified models is compared in terms of simulation errors, to assess whether they can represent and predict the waveform with confidence bounds under the confounding effects of noise and possible temperature variations, using a dataset preprocessed with principal component analysis. The results show that the identified GP-NARX model achieves correct classification with a reduced number of false alarms, even under propagation of model uncertainties, for both healthy and damaged conditions.
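A minimal GP-NARX sketch on a synthetic nonlinear system (the lag structure, system, and noise levels are illustrative assumptions, not the paper's PZT data): lagged outputs and the exogenous input form the regressors, and a GP maps them to the next output.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(4)

# Synthetic nonlinear system standing in for the measured structural response
u = np.sin(0.3 * np.arange(300)) + 0.1 * rng.standard_normal(300)
y = np.zeros(300)
for k in range(2, 300):
    y[k] = 0.7 * y[k-1] - 0.2 * y[k-2] + 0.5 * u[k-1] - 0.1 * y[k-1] ** 2

# NARX regressors: two lagged outputs and one lagged exogenous input
X = np.column_stack([y[1:-1], y[:-2], u[1:-1]])
target = y[2:]

gp = GaussianProcessRegressor(RBF([1.0, 1.0, 1.0]) + WhiteKernel(1e-4),
                              normalize_y=True)
gp.fit(X[:200], target[:200])

# One-step-ahead prediction with confidence bounds on held-out data
pred, sd = gp.predict(X[200:], return_std=True)
err = np.mean(np.abs(pred - target[200:]))
assert err < 0.1 * np.std(target[200:])
```

In an SHM setting, the model is identified on healthy data; a damage index can then be built from how far new responses fall outside the GP's confidence bounds.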


Author(s):  
Lee J. Wells ◽  
Mohammed S. Shafae ◽  
Jaime A. Camelio

Ever-advancing sensor and measurement technologies continually provide new opportunities for knowledge discovery and quality control (QC) strategies for complex manufacturing systems. One such state-of-the-art measurement technology currently being implemented in industry is the 3D laser scanner, which can rapidly provide millions of data points to represent an entire manufactured part's surface. This gives 3D laser scanners a significant advantage over competing technologies that typically provide tens or hundreds of data points. Consequently, data collected from 3D laser scanners have great potential to be used for inspecting parts for surface and feature abnormalities. The current use of 3D point clouds for part inspection falls into two main categories: 1) extracting feature parameters, which does not complement the nature of 3D point clouds, as it wastes valuable data; and 2) an ad hoc manual process in which a visual representation of the point cloud (usually as deviations from nominal) is analyzed, which tends to produce slow, inefficient, and inconsistent inspection results. Therefore, our paper proposes an approach to automate the latter type of 3D point cloud inspection. The proposed approach uses a newly developed adaptive generalized likelihood ratio (AGLR) technique to identify the most likely size, shape, and magnitude of a potential fault within the point cloud, which transforms the ad hoc visual inspection approach into a statistically viable automated inspection solution. To aid practitioners in designing and implementing an AGLR-based inspection process, our paper also reports the performance of the AGLR with respect to the probability of detecting faults of specific sizes and magnitudes, in addition to the probability of false alarms.
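The AGLR itself is specific to the paper, but the underlying generalized likelihood ratio scan is easy to sketch in one dimension: slide windows of several candidate sizes over a deviation-from-nominal profile and report the window that maximizes the mean-shift GLR statistic. Everything below (profile, noise level, fault) is synthetic.

```python
import numpy as np

rng = np.random.default_rng(5)

# Deviation-from-nominal profile along one scan line (noise sigma assumed known)
sigma = 0.02
dev = sigma * rng.standard_normal(200)
dev[80:100] += 0.05                  # a local surface fault of magnitude 0.05

def glr_scan(d, sigma, sizes=(5, 10, 20, 40)):
    """Return (statistic, start, size) of the window maximising the GLR
    statistic for a mean shift against the zero-mean noise hypothesis."""
    best = (-np.inf, None, None)
    for w in sizes:
        for s0 in range(len(d) - w + 1):
            # GLR for H1 (mean != 0 in window) vs H0 (pure noise)
            stat = d[s0:s0 + w].sum() ** 2 / (w * sigma ** 2)
            if stat > best[0]:
                best = (stat, s0, w)
    return best

stat, start, size = glr_scan(dev, sigma)
assert size == 20 and 75 <= start <= 85   # recovers the fault's size and location
```

Scanning over both position and size is what lets the method report the most likely fault extent and magnitude, rather than merely flagging that something is off.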


HUMANIS ◽  
2018 ◽  
pp. 133
Author(s):  
Kadek Adidi Saputra ◽  
Ni Made Suryati ◽  
Tjok Istri Agung Mulyawati

The study is titled "The Text of the Myth of Ida Ki Dukuh Sakti in Gelar Temple, Pakraman Gelogor Village, Ubud District, Gianyar Regency: An Analysis of Structure, Function, and Value". It aims to describe the structure of the content of the mythical text of Ida Ki Dukuh Sakti and to analyze its function and value. The theories used in this research are structural theory, function theory, and value theory. Data were collected using observation and recording techniques, together with interviews supported by audio recording and note-taking. The results of the analysis are presented using informal methods. The content structure of the mythical text of Ida Ki Dukuh Sakti has three parts: opening, content, and closing. The opening describes the figure of Ida Ki Dukuh Sakti; the content describes the ceremony, sraddha as a form of bhakti, and the disaster that occurred in Pakraman Gelogor village; and the closing describes the end of the mythical text. The functions contained in the mythical text of Ida Ki Dukuh Sakti are (1) a social function, (2) a function as harmony of natural elements, and (3) a religious function, while the values contained in the text are (1) magical values and (2) the value of local wisdom.


2003 ◽  
Vol 209 ◽  
pp. 529-530
Author(s):  
Stefan Kimeswenger

The central star V4334 Sgr (Sakurai's Nova) of the planetary nebula PN G010.4+04.4 underwent the rare event of a very late helium shell flash in 1995–1996. The rapid formation of a dust shell allows a precise model based on the dust formation history instead of the commonly used average dust formation rate and mass loss. The model applies a complete dust grain-size distribution and includes transiently heated grains. Fitting the visual light curve alone already yields the complete spectral energy distribution (SED) model; no adjustment to the infrared data points is needed.


2019 ◽  
Vol 50 (3) ◽  
pp. 778-791 ◽  
Author(s):  
Bas van Stein ◽  
Hao Wang ◽  
Wojtek Kowalczyk ◽  
Michael Emmerich ◽  
Thomas Bäck

Abstract Kriging, or Gaussian Process Regression, is applied in many fields as a non-linear regression model, as well as a surrogate model in the field of evolutionary computation. However, the computational and space complexity of Kriging, which are cubic and quadratic in the number of data points respectively, become a major bottleneck with more and more data available nowadays. In this paper, we propose a general methodology for complexity reduction, called cluster Kriging, where the whole data set is partitioned into smaller clusters and multiple Kriging models are built on top of them. In addition, four Kriging approximation algorithms are proposed as candidate algorithms within the new framework. Each of these algorithms can be applied to much larger data sets while maintaining the advantages and power of Kriging. The proposed algorithms are explained in detail and compared empirically against a broad set of existing state-of-the-art Kriging approximation methods on a well-defined testing framework. According to the empirical study, the proposed algorithms consistently outperform the existing algorithms. Moreover, some practical suggestions are provided for using the proposed algorithms.
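The core idea, partitioning the data and fitting one Kriging model per cluster, can be sketched with scikit-learn. This is a generic sketch, not one of the paper's four algorithms; the nearest-centre routing rule and the kernels are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(6)
X = rng.uniform(-3, 3, (600, 1))
y = np.sin(2 * X.ravel()) + 0.05 * rng.standard_normal(600)

# Partition the data set and fit one Kriging (GP) model per cluster:
# each model sees ~150 points instead of 600, cutting the cubic cost
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
models = [
    GaussianProcessRegressor(RBF(1.0) + WhiteKernel(1e-3), normalize_y=True)
    .fit(X[km.labels_ == c], y[km.labels_ == c])
    for c in range(4)
]

def predict(x):
    # Route each query to the model of its nearest cluster centre
    c = km.predict(x)
    return np.array([models[ci].predict(xi.reshape(1, -1))[0]
                     for ci, xi in zip(c, x)])

x_test = np.linspace(-2.5, 2.5, 50).reshape(-1, 1)
err = np.max(np.abs(predict(x_test) - np.sin(2 * x_test.ravel())))
assert err < 0.2
```

More refined variants blend the cluster models (e.g. weighting predictions by cluster membership probabilities) to smooth discontinuities at cluster boundaries; the hard routing above is the simplest instance of the framework.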

