Multilevel RTN Removal Tools for Dynamic FBG Strain Measurements Corrupted by Peak-Splitting Artefacts

Sensors ◽  
2021 ◽  
Vol 22 (1) ◽  
pp. 92
Author(s):  
Dominik Johannes Marius Fallais ◽  
Maximilian Henkel ◽  
Nymfa Noppe ◽  
Wout Weijtjens ◽  
Christof Devriendt

Strain measurements using fibre Bragg grating (FBG) optical sensors are becoming ever more commonplace. However, in some cases these measurements can become corrupted by sudden jumps in the signal, which manifest as spikes or step-like offsets in the data. These jumps are caused by a defect in the FBG itself, referred to as peak-splitting. The effects of peak-splitting artefacts on FBG strain measurements resemble an additive multi-level telegraph noise process in which the amplitudes and occurrences of the jumps are related to fibre deformation states. Whenever it is not possible to re-assess the raw spectral data with advanced peak-tracking software, other means of removing the jumps from the data have to be found. The two methods presented in this article are aimed at removing additive multi-level random telegraph noise (RTN) from the raw data. Both methods denoise the sample-wise difference signal using an outlier detection scheme followed by an outlier replacement step. Once the difference signal has been denoised, the cumulative sum is used to arrive back at a strain time series. The two methods are demonstrated by reconstructing severely corrupted strain time series; the data for this verification were collected from sub-soil strain measurements on an operational offshore wind turbine. The results show that the proposed methods can effectively reconstruct the dynamic content of the corrupted strain time series. It is also illustrated that errors in the outlier replacements accumulate and can cause a quasi-static drift. A representative mean value and drift correction are proposed in terms of an optimization problem that maximizes the overlap between the reconstruction and a subset of the raw data; alternatively, a high-pass filter is suggested to remove the quasi-static drift if only the dynamic band of the signal is of interest.
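The denoise-then-cumulate pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the MAD-based outlier detector and the constant (median) replacement value are assumed stand-ins for the detection and replacement schemes actually used.

```python
import numpy as np

def remove_rtn(signal, threshold=5.0):
    """Remove additive multi-level telegraph noise from a strain series.

    Sketch: detect outliers in the sample-wise difference signal,
    replace them, then cumulatively sum back to a strain time series.
    """
    diff = np.diff(signal, prepend=signal[0])
    med = np.median(diff)
    # Robust outlier detection on the difference signal (MAD-based).
    mad = np.median(np.abs(diff - med))
    scale = 1.4826 * mad if mad > 0 else np.std(diff)
    outliers = np.abs(diff - med) > threshold * scale
    # Outlier replacement step: substitute the median of the clean samples.
    clean_diff = diff.copy()
    clean_diff[outliers] = np.median(diff[~outliers])
    # Cumulative sum returns a strain time series (up to an offset).
    return np.cumsum(clean_diff)
```

Note that each replaced sample discards the true dynamic increment at that instant, which is exactly the accumulating replacement error that the abstract says causes a quasi-static drift.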

2017 ◽  
Vol 26 (1) ◽  
pp. 018502 ◽  
Author(s):  
Yiming Liao ◽  
Xiaoli Ji ◽  
Yue Xu ◽  
Chengxu Zhang ◽  
Qiang Guo ◽  
...  

2021 ◽  
Author(s):  
Giovanni Nico ◽  
Pier Francesco Biagi ◽  
Anita Ermini ◽  
Mohammed Yahia Boudjada ◽  
Hans Ulrich Eichelberger ◽  
...  

<p>Since 2009, several radio receivers have been installed throughout Europe to realize the INFREP European radio network for studying the VLF (10-50 kHz) and LF (150-300 kHz) radio precursors of earthquakes. Precursors can be related to “anomalies” in the night-time behavior of VLF signals. A suitable method of analysis is the use of Wavelet spectra. Using the “Morlet function”, the Wavelet transform of a time signal is a complex series that can be usefully represented by its square amplitude, i.e. by considering the so-called Wavelet power spectrum.</p><p>The power spectrum is a 2D diagram that, once properly normalized with respect to the power of white noise, gives information on the strength and precise time of occurrence of the various Fourier components present in the original time series. The main difference between the Wavelet power spectrum and the Fourier power spectrum of a time series is that the former identifies the frequency content along the operational time, which cannot be done with the latter. Anomalies are identified as regions of the Wavelet spectrogram characterized by a sudden increase in power strength.</p><p>On January 30, 2020 an earthquake with Mw = 6.0 occurred in the Dodecanese Islands. The results of the Wavelet analysis carried out on data collected by some INFREP receivers are compared with the trends of the raw data. The time series from January 24, 2020 until January 31, 2020 was analyzed. The Wavelet spectrogram shows a peak corresponding to a period of 1 day on the days before January 30. This anomaly was found for signals transmitted at the frequencies 19.58 kHz, 20.27 kHz and 23.40 kHz, with the energy in the peak increasing from 19.58 kHz to 23.40 kHz. In particular, the signal at 19.58 kHz shows a peak on January 29, while the frequencies 20.27 kHz and 23.40 kHz are characterized by a peak starting on January 28 and continuing to January 29. The results presented in this work show the prospective use of Wavelet spectrum analysis as an operational tool for the detection of anomalies in VLF and LF signals potentially related to EQ precursors.</p>
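A Morlet wavelet power spectrum of the kind used in this analysis can be computed by direct convolution. This is a minimal sketch, not the INFREP processing chain: the Morlet parameter w0 = 6, the scale-to-period relation, and the white-noise normalization by the series variance are common conventions assumed here.

```python
import numpy as np

def morlet_power_spectrum(x, periods, dt=1.0, w0=6.0):
    """Wavelet power spectrum (squared Morlet CWT amplitude) of x,
    evaluated at the requested Fourier periods (in units of dt)."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    power = np.empty((len(periods), len(x)))
    for i, p in enumerate(periods):
        # Fourier period -> wavelet scale for the Morlet wavelet.
        s = p * (w0 + np.sqrt(2.0 + w0**2)) / (4.0 * np.pi)
        m = int(4 * s / dt)                      # kernel half-width
        t = np.arange(-m, m + 1) * dt
        # Complex Morlet wavelet, energy-normalized per scale.
        psi = (np.pi**-0.25 * np.exp(1j * w0 * t / s)
               * np.exp(-0.5 * (t / s)**2)) * np.sqrt(dt / s)
        coef = np.convolve(x, np.conj(psi[::-1]), mode='same')
        power[i] = np.abs(coef)**2
    return power / x.var()                       # ~O(1) for white noise
```

An "anomaly" in the sense of the text would then appear as a localized region of this 2D array with power well above the white-noise level at the period of interest (here, 1 day).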


Author(s):  
BRANDON WHITCHER ◽  
PETER F. CRAIGMILE

We investigate the use of Hilbert wavelet pairs (HWPs) in the non-decimated discrete wavelet transform for the time-varying spectral analysis of multivariate time series. HWPs consist of two high-pass and two low-pass compactly supported filters, such that one high-pass filter is the Hilbert transform (approximately) of the other. Thus, common quantities in the spectral analysis of time series (e.g., power spectrum, coherence, phase) may be estimated in both time and frequency. Compact support of the wavelet filters ensures that the frequency axis will be partitioned dyadically as with the usual discrete wavelet transform. The proposed methodology is used to analyze a bivariate time series of zonal (u) and meridional (v) winds over Truk Island.


2019 ◽  
Vol 11 (6) ◽  
pp. 640 ◽  
Author(s):  
Beibei Wang ◽  
Zhenjie Chen ◽  
A-Xing Zhu ◽  
Yuzhu Hao ◽  
Changqing Xu

As urbanization has profound effects on global environmental changes, quick and accurate monitoring of the dynamic changes in impervious surfaces is of great significance for environmental protection. The increased spatiotemporal resolution of imagery makes it possible to construct time series to obtain long-time-period and high-accuracy information about impervious surface expansion. In this study, a three-step monitoring method based on time series trajectory segmentation was developed to extract impervious surface expansion using Landsat time series and was applied to the Xinbei District, Changzhou, China, from 2005 to 2017. Firstly, the original time series was segmented and fitted to remove the noise caused by clouds, shadows, and interannual differences, leaving only the trend information. Secondly, the time series trajectory features of impervious surface expansion were described using three phases and four types with nine parameters by analyzing the trajectory characteristics. Thirdly, a multi-level classification method was used to determine the scope of impervious surface expansion, and the expansion time was superimposed to obtain a spatiotemporal distribution map. The proposed method yielded an overall accuracy of 90.58% and a Kappa coefficient of 0.90, demonstrating that Landsat time series remote sensing images could be used effectively in this approach to monitor the spatiotemporal expansion of impervious surfaces.
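The core of the trajectory-segmentation step can be illustrated with a deliberately simplified sketch: instead of the paper's three-phase, nine-parameter model, a single break point splitting an annual index trajectory (e.g. NDVI) into two constant segments is fitted, and a conversion is reported when the index drops. The 0.2 drop threshold is an assumed placeholder.

```python
import numpy as np

def detect_conversion_year(values, years, min_drop=0.2):
    """Find the break point that best splits an annual index trajectory
    into two constant segments (least total squared error), and return
    the year of conversion if the mean drops (vegetated -> impervious)."""
    n = len(values)
    best_sse, best_k = np.inf, None
    for k in range(1, n):
        left, right = values[:k], values[k:]
        sse = ((left - left.mean())**2).sum() + ((right - right.mean())**2).sum()
        if sse < best_sse:
            best_sse, best_k = sse, k
    if values[:best_k].mean() - values[best_k:].mean() > min_drop:
        return years[best_k]
    return None
```

In the paper's pipeline this per-pixel decision would be preceded by the noise-removal fitting (clouds, shadows, interannual differences) and followed by the multi-level classification and the superposition of expansion times into a map.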


2020 ◽  
Author(s):  
Oleg Skrynyk ◽  
Enric Aguilar ◽  
José A. Guijarro ◽  
Sergiy Bubin

<p>Before using climatological time series in research studies, it is necessary to perform their quality control and homogenization in order to remove possible artefacts (inhomogeneities) usually present in the raw data sets. In the vast majority of cases, the homogenization procedure allows one to improve the consistency of the data, which can then be verified by means of a statistical comparison of the raw and homogenized time series. However, a new question then arises: how far are the homogenized data from the true climate signal or, in other words, what errors could still be present in the homogenized data?</p><p>The main objective of our work is to estimate the uncertainty produced by the adjustment algorithm of the widely used Climatol homogenization software when homogenizing daily time series of additive climate variables. We focused our efforts on the minimum and maximum air temperature. In order to achieve our goal we used a benchmark data set created by the INDECIS<sup>*</sup> project. The benchmark contains clean data, extracted from an output of the Royal Netherlands Meteorological Institute Regional Atmospheric Climate Model (version 2) driven by Hadley Global Environment Model 2 - Earth System, and inhomogeneous data, created by introducing realistic breaks and errors.</p><p>The statistical evaluation of discrepancies between the homogenized (by means of Climatol with predefined break points) and clean data sets was performed using both a set of standard parameters and metrics introduced in our work. All the metrics used clearly identify the main features of the errors (systematic and random) present in the homogenized time series. We calculated the metrics for every time series (only over adjusted segments) as well as their averaged values as measures of the uncertainties in the whole data set.</p><p>In order to determine how the two key parameters of the raw data collection, namely the length of the time series and the station density, influence the calculated measures of the adjustment error, we gradually decreased the length of the period and the number of stations in the area under study. The total number of cases considered was 56, comprising 7 time periods (1950-2005, 1954-2005, …, 1974-2005) and 8 different quantities of stations (100, 90, …, 30). Additionally, in order to find out how stable the calculated metrics are for each of the 56 cases and to determine their confidence intervals, we performed 100 random permutations of the introduced inhomogeneities and repeated our calculations. With that, the total number of homogenization exercises performed was 5600 for each of the two climate variables.</p><p>Lastly, the calculated metrics were compared with the corresponding values obtained for the raw time series. The comparison showed a substantial improvement of the metric values after homogenization in each of the 56 cases considered (for both variables).</p><p>-------------------</p><p><sup>*</sup>INDECIS is a part of ERA4CS, an ERA-NET initiated by JPI Climate, and funded by FORMAS (SE), DLR (DE), BMWFW (AT), IFD (DK), MINECO (ES), ANR (FR) with co-funding by the European Union (Grant 690462). The work has been partially supported by the Ministry of Education and Science of Kazakhstan (Grant BR05236454) and Nazarbayev University (Grant 090118FD5345).</p>
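The separation of the adjustment error into systematic and random parts can be sketched with standard residual statistics. These are assumed stand-ins for the specific metrics introduced in the paper, which are not given in the abstract.

```python
import numpy as np

def adjustment_error_metrics(homogenized, clean):
    """Compare a homogenized series against the clean benchmark over an
    adjusted segment: bias captures the systematic error, the residual
    standard deviation the random error, RMSE the combined error."""
    resid = np.asarray(homogenized, dtype=float) - np.asarray(clean, dtype=float)
    return {
        "bias": resid.mean(),               # systematic error
        "random": resid.std(ddof=1),        # random error
        "rmse": np.sqrt((resid**2).mean()), # combined measure
    }
```

Averaging such per-series values over the network, as described above, yields data-set-level uncertainty measures that can be tracked as the period length and station density are reduced.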


2019 ◽  
Vol 11 (1) ◽  
pp. 87-91
Author(s):  
Xiaojuan Zhao ◽  
Wenbing Fan ◽  
Yaping Wang ◽  
Haoliang Li ◽  
Xiaonan Yang

2012 ◽  
Vol 14 (3) ◽  
pp. 574-584 ◽  
Author(s):  
B. Bhattacharya ◽  
T. van Kessel ◽  
D. P. Solomatine

A problem of predicting suspended particulate matter (SPM) concentration on the basis of wind and wave measurements and estimates of bed shear stress made by a numerical model is considered. Data from a location 10 km offshore of Noordwijk in the Dutch coastal area are used. The time series data have been filtered with a low-pass filter to remove short-term fluctuations due to noise and tides, and the resulting time series have been used to build an artificial neural network (ANN) model. The accuracy of the ANN model during both storm and calm periods was found to be high. The possibility of applying the trained ANN model at other locations, where the model is assisted by correctors based on the ratio of the long-term average SPM value for the considered location to that for Noordwijk (for which the model was trained), has been investigated. These experiments demonstrated that the ANN model's accuracy at the other locations was acceptable, which shows the potential of the considered approach.
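The low-pass preprocessing step can be illustrated with a simple centered moving average, an assumed stand-in for the filter actually used: a 25 h window on hourly data strongly attenuates the semidiurnal (~12.4 h) tidal signal and noise while preserving the longer-term variations fed to the ANN.

```python
import numpy as np

def lowpass_moving_average(series, window=25):
    """Centered moving-average low-pass filter (hypothetical parameters:
    25-sample window for hourly data, near-null at the ~12.4 h tide)."""
    kernel = np.ones(window) / window
    # mode='same' keeps the series length; the first and last ~window/2
    # samples are biased toward zero and should be discarded.
    return np.convolve(series, kernel, mode='same')
```

In the paper's workflow the filtered wind, wave and bed-shear-stress series would then form the input features, and the filtered SPM series the target, of the ANN regression.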


2011 ◽  
Vol 63 (3) ◽  
pp. 369-376 ◽  
Author(s):  
M. Métadier ◽  
J. -L. Bertrand-Krajewski

With the increasing implementation of continuous monitoring of both discharge and water quality in sewer systems, large databases are now available. In order to manage large amounts of data and calculate various variables and indicators of interest, it is necessary to apply automated methods for data processing. This paper deals with the processing of short-time-step turbidity time series to estimate TSS (Total Suspended Solids) and COD (Chemical Oxygen Demand) event loads in sewer systems during storm events, together with their associated uncertainties. The following steps are described: (i) sensor calibration, (ii) estimation of data uncertainties, (iii) correction of raw data, (iv) data pre-validation tests, (v) final validation, and (vi) calculation of TSS and COD event loads and estimation of their uncertainties. These steps have been implemented in an integrated software tool. Examples of results are given for a set of 33 storm events monitored in a separate stormwater sewer system.
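Step (vi) can be sketched as summing concentration x discharge over the event and propagating the measurement uncertainties to the load. This is a minimal illustration under stated assumptions: first-order propagation, independent per-step errors, and placeholder relative uncertainties rather than the values derived in steps (i)-(ii) of the paper.

```python
import numpy as np

def event_load(conc_mg_per_l, flow_l_per_s, dt_s, u_conc_rel=0.2, u_flow_rel=0.1):
    """Event load (mg) from concentration (mg/L) and discharge (L/s)
    series at time step dt_s (s), with a first-order uncertainty estimate."""
    incremental = conc_mg_per_l * flow_l_per_s * dt_s   # mg per time step
    load = incremental.sum()
    # Combine independent relative uncertainties of concentration and
    # discharge in quadrature, per time step.
    u_steps = incremental * np.hypot(u_conc_rel, u_flow_rel)
    # Assume step errors are independent when summing over the event.
    u_load = np.sqrt((u_steps**2).sum())
    return load, u_load
```

A correlated-error model (e.g. a shared calibration bias across the whole event) would instead add the per-step uncertainties linearly and give a larger, more conservative bound.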

