Multiscale Estimation of Event Arrival Times and Their Uncertainties in Hydroacoustic Records from Autonomous Oceanic Floats

2020 ◽  
Vol 110 (3) ◽  
pp. 970-997 ◽  
Author(s):  
Joel D. Simon ◽  
Frederik J. Simons ◽  
Guust Nolet

ABSTRACT We describe an algorithm to pick event onsets in noisy records, characterize their error distributions, and derive confidence intervals on their timing. Our method is based on an Akaike information criterion that identifies the partition of a time series into a noise and a signal segment that maximizes the signal-to-noise ratio. The distinctive feature of our approach lies in the timing uncertainty analysis, and in its application in the time domain and in the wavelet timescale domain. Our novel data are records collected by freely floating Mobile Earthquake Recording in Marine Areas by Independent Divers (MERMAID) instruments, midcolumn hydrophones that report triggered segments of ocean-acoustic time series.
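The core of an AIC-based onset picker can be sketched in a few lines: find the partition point that minimizes a two-segment Akaike information criterion. This is a generic simplification for illustration, not the authors' exact multiscale formulation or their uncertainty analysis:

```python
import numpy as np

def aic_pick(x):
    """Return the sample k that best splits x into a noise segment
    x[:k] and a signal segment x[k:], by minimizing the common
    two-segment AIC: k*log(var1) + (n-k)*log(var2)."""
    n = len(x)
    best_k, best_aic = None, np.inf
    for k in range(2, n - 2):            # need >= 2 samples per segment
        v1, v2 = np.var(x[:k]), np.var(x[k:])
        if v1 <= 0 or v2 <= 0:
            continue
        aic = k * np.log(v1) + (n - k) * np.log(v2)
        if aic < best_aic:
            best_aic, best_k = aic, k
    return best_k

# Synthetic record: 500 samples of weak noise, then a strong arrival.
rng = np.random.default_rng(0)
x = rng.normal(0, 1, 1000)
x[500:] += 5 * rng.normal(0, 3, 500)     # signal onset at sample 500
onset = aic_pick(x)
```

With a clear signal-to-noise contrast the minimum of the criterion sits at (or within a few samples of) the true onset.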

Sensors ◽  
2018 ◽  
Vol 18 (11) ◽  
pp. 3985 ◽  
Author(s):  
Siyu Chen ◽  
Yanzhang Wang ◽  
Jun Lin

The residence time difference (RTD) fluxgate sensor is a promising device for measuring DC or low-frequency magnetic fields in the time domain. Nevertheless, jitter noise and magnetic noise severely degrade the detection result. A novel post-processing algorithm for jitter noise reduction of the RTD fluxgate output, based on the single-frequency time difference (SFTD) method, is proposed in this study to boost the performance of the RTD system. This algorithm extracts the signal component at a fixed frequency and preserves its time-domain information via a time–frequency transformation. Thereby, a single-frequency signal free of jitter noise, which still carries the ambient field information in its time difference, is obtained. Consequently, compared with the traditional comparator RTD method (CRTD), the stability of the RTD estimation (in other words, the signal-to-noise ratio of the residence time difference) is significantly improved, with a sensitivity of 4.3 μs/nT. Furthermore, the experimental results reveal that the RTD fluxgate is comparable to harmonic fluxgate sensors in terms of noise floor.
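The two ingredients can be illustrated with a crude sketch: extract a single frequency component via the FFT (a stand-in for the time–frequency transformation step), and read a bias field out of the residence-time difference of the zero crossings. The synthetic signal, sampling rate, and function names are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

fs = 1000                     # sample rate (Hz), illustrative
t = np.arange(0, 1, 1 / fs)
f0 = 5.0                      # "excitation" frequency (Hz), illustrative

def single_freq(x, fs, f0):
    """Keep only the spectral bin nearest f0 -- a crude stand-in for
    the single-frequency extraction step."""
    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    Y = np.zeros_like(X)
    k = np.argmin(np.abs(freqs - f0))
    Y[k] = X[k]
    return np.fft.irfft(Y, n=len(x))

def residence_time_difference(x, fs, n_periods):
    """Average (time above zero) - (time below zero) per period."""
    above = np.sum(x > 0) / fs
    below = np.sum(x <= 0) / fs
    return (above - below) / n_periods

# Denoising: the single-frequency component survives heavy noise.
rng = np.random.default_rng(0)
noisy = np.sin(2 * np.pi * f0 * t) + 0.5 * rng.normal(size=len(t))
clean = single_freq(noisy, fs, f0)

# A small DC offset (standing in for the ambient field) shifts the
# residence-time difference away from zero:
rtd_zero = residence_time_difference(np.sin(2 * np.pi * f0 * t), fs, 5)
rtd_bias = residence_time_difference(np.sin(2 * np.pi * f0 * t) + 0.2, fs, 5)
```

A symmetric waveform gives a residence-time difference near zero; the offset waveform gives a positive one, which is the quantity an RTD readout converts back to field strength.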


2008 ◽  
Vol 25 (4) ◽  
pp. 534-546 ◽  
Author(s):  
Anthony Arguez ◽  
Peng Yu ◽  
James J. O’Brien

Abstract Time series filtering (e.g., smoothing) can be done in the spectral domain without loss of endpoints. However, filtering is commonly performed in the time domain using convolutions, resulting in lost points near the series endpoints. Multiple incarnations of a least squares minimization approach are developed that retain the endpoint intervals that are normally discarded due to filtering with convolutions in the time domain. The techniques minimize the errors between the predetermined frequency response function (FRF), a fundamental property of all filters, of interior points and the FRFs that are to be determined for each position in the endpoint zone. The least squares techniques are differentiated by their constraints: 1) unconstrained, 2) an equal-mean constraint, and 3) an equal-variance constraint. The equal-mean constraint forces the new weights to sum to the same value as the predetermined weights. The equal-variance constraint forces the new weights to be such that, after being convolved with the input values, the expected time series variance is preserved. The three least squares methods are each tested under three separate filtering scenarios [involving Arctic Oscillation (AO), Madden–Julian oscillation (MJO), and El Niño–Southern Oscillation (ENSO) time series] and compared to each other as well as to the spectral filtering method, the standard of comparison. The results indicate that all four methods (including the spectral method) possess skill at determining suitable endpoint estimates. However, both the unconstrained and equal-mean schemes exhibit a bias toward zero near the terminal ends due to problems with appropriating variance. The equal-variance method does not exhibit this bias and was never the worst performer. It showed great promise in the ENSO scenario involving a 5-month running-mean filter, and performed at least on par with the other realistic methods for almost all time series positions in all three filtering scenarios.
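The linear-algebra core of the equal-mean scheme, as described, is an equality-constrained least squares fit of endpoint weights to the interior filter's FRF. A minimal sketch, assuming (hypothetically) a 5-point running mean as the interior filter and only three one-sided weights available at the very first data point:

```python
import numpy as np

def equal_mean_lsq(A, b, s):
    """Minimize ||A w - b||^2 subject to sum(w) = s, by solving the
    KKT system  [2 A^T A, 1; 1^T, 0] [w; lam] = [2 A^T b; s]."""
    n = A.shape[1]
    K = np.zeros((n + 1, n + 1))
    K[:n, :n] = 2 * A.T @ A
    K[:n, n] = 1.0                 # gradient of the sum constraint
    K[n, :n] = 1.0                 # the constraint row itself
    rhs = np.concatenate([2 * A.T @ b, [s]])
    return np.linalg.solve(K, rhs)[:n]

# Target FRF: symmetric 5-point running mean evaluated on a frequency grid.
f = np.linspace(0.0, 0.5, 64)
target = (1 + 2 * np.cos(2 * np.pi * f) + 2 * np.cos(4 * np.pi * f)) / 5

# FRF basis of 3 one-sided endpoint weights (complex response, so we
# stack real and imaginary parts to keep the fit real-valued).
E = np.exp(-2j * np.pi * np.outer(f, np.arange(3)))
A = np.vstack([E.real, E.imag])
b = np.concatenate([target, np.zeros_like(f)])

# Equal-mean constraint: new weights sum to 1, like the interior weights.
w = equal_mean_lsq(A, b, s=1.0)
```

Dropping the constraint (plain least squares on A, b) generally yields weights whose sum drifts from 1, which is exactly the bias the equal-mean scheme is designed to suppress.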


Geophysics ◽  
2009 ◽  
Vol 74 (4) ◽  
pp. J35-J48 ◽  
Author(s):  
Bernard Giroux ◽  
Abderrezak Bouchedda ◽  
Michel Chouteau

We introduce two new traveltime picking schemes developed specifically for crosshole ground-penetrating radar (GPR) applications. The main objective is to automate, at least partially, the traveltime picking procedure and to provide first-arrival times that are closer in quality to those of manual picking approaches. The first scheme is an adaptation of a method based on cross-correlation of radar traces collated in gathers according to their associated transmitter-receiver angle. A detector is added to isolate the first cycle of the radar wave and to suppress secondary arrivals that might be mistaken for first arrivals. To improve the accuracy of the arrival times obtained from the cross-correlation lags, a time-rescaling scheme is implemented to resize the radar wavelets to a common time-window length. The second method is based on the Akaike information criterion (AIC) and the continuous wavelet transform (CWT). It is not tied to the restrictive criterion of waveform similarity that underlies cross-correlation approaches, which is not guaranteed for traces sorted in common ray-angle gathers. It has the advantage of being fully automated. The performance of the new algorithms is tested with synthetic and real data. In all tests, the approach that adds first-cycle isolation to the original cross-correlation scheme improves the results. In contrast, the time-rescaling approach brings limited benefits, except when strong dispersion is present in the data. In addition, the performance of cross-correlation picking schemes degrades for data sets with disparate waveforms despite the high signal-to-noise ratio of the data. In general, the AIC-CWT approach is more versatile and performs well on all data sets. Only with data showing low signal-to-noise ratios is the AIC-CWT superseded by the modified cross-correlation picker.
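The cross-correlation step that both the original and modified schemes build on can be sketched simply: take the lag at the peak of the cross-correlation between a reference trace and a target trace. This is a minimal sketch with a synthetic Gabor-like wavelet; the first-cycle detector and the time-rescaling step are omitted:

```python
import numpy as np

def xcorr_lag(ref, trace):
    """Lag (in samples) that best aligns `trace` with `ref`,
    from the peak of the full cross-correlation."""
    c = np.correlate(trace, ref, mode="full")
    return np.argmax(c) - (len(ref) - 1)

# Synthetic radar-like wavelet arriving at two different times.
n = 400
t = np.arange(n)
def wavelet(t0):
    return np.exp(-((t - t0) / 8.0) ** 2) * np.cos(0.6 * (t - t0))

ref = wavelet(100.0)      # reference arrival at sample 100
trace = wavelet(137.0)    # same waveform delayed by 37 samples
lag = xcorr_lag(ref, trace)   # -> 37
```

The method is exact here because the two waveforms are identical; the abstract's point is precisely that picking degrades when waveforms in a gather are dissimilar.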


Geophysics ◽  
2007 ◽  
Vol 72 (3) ◽  
pp. S149-S154 ◽  
Author(s):  
Antoine Guitton ◽  
Alejandro Valenciano ◽  
Dimitri Bevc ◽  
Jon Claerbout

Amplitudes in shot-profile migration can be improved if the imaging condition incorporates a division (deconvolution in the time domain) of the upgoing wavefield by the downgoing wavefield. This division can be enhanced by introducing an optimal Wiener filter, which assumes that the noise present in the data has a white spectrum. This assumption requires a damping parameter, related to the signal-to-noise ratio, that is often chosen by trial and error. In practice, the damping parameter replaces the small values of the spectrum of the downgoing wavefield and avoids division by zero. The migration results can be quite sensitive to the damping parameter, and in most applications, the upgoing and downgoing wavefields are simply multiplied. Alternatively, the division can be made stable by filling the small values of the spectrum with an average of the neighboring points. This averaging is obtained by running a smoothing operator on the spectrum of the downgoing wavefield. We call this operation the smoothing imaging condition. Our results show that where the spectrum of the downgoing wavefield is high, the imaging conditions with damping and smoothing yield similar results, thus correcting for illumination effects. Where the spectrum is low, the smoothing imaging condition tends to be more robust to the noise level present in the data, thus giving better images than the imaging condition with damping. In addition, our experiments indicate that the parameterization of the smoothing imaging condition, i.e., the choice of window size for the smoothing operator, is easy and repeatable from one data set to another, making it a valuable addition to our imaging toolbox.
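The two imaging conditions, damped division versus smoothed division, can be sketched per frequency sample as follows. The one-dimensional spectra and the smoothing window length here are illustrative assumptions:

```python
import numpy as np

def smoothing_imaging_condition(U, D, win=11):
    """Deconvolution imaging condition U * conj(D) / <|D|^2>, where <.>
    is a running mean of the downgoing-wavefield power spectrum over
    `win` frequency samples; no damping parameter is needed."""
    power = np.abs(D) ** 2
    kernel = np.ones(win) / win
    smoothed = np.convolve(power, kernel, mode="same")
    return U * np.conj(D) / smoothed

def damped_imaging_condition(U, D, eps):
    """Classic damped division; eps plays the role of the
    trial-and-error damping parameter."""
    return U * np.conj(D) / (np.abs(D) ** 2 + eps)

# Toy spectra: downgoing wavefield weak at both ends of the band,
# upgoing wavefield = reflectivity 0.5 times the downgoing wavefield.
D = np.sin(np.linspace(0, np.pi, 256))
U = 0.5 * D
I_smooth = smoothing_imaging_condition(U, D, win=11)
I_damp = damped_imaging_condition(U, D, eps=1e-3)
```

Where |D| is large both conditions recover the reflectivity of 0.5; where |D| goes to zero the neighborhood average keeps the smoothed division finite without any hand-tuned damping.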


1999 ◽  
Vol 3 (1) ◽  
pp. 69-83 ◽  
Author(s):  
Hui Boon Tan ◽  
Richard Ashley

A simple technique for directly testing the parameters of a time-series regression model for instability across frequencies is presented. The method can be implemented easily in the time domain, so that parameter instability across frequency bands can be conveniently detected and modeled in conjunction with other econometric features of the problem at hand, such as simultaneity, cointegration, missing observations, and cross-equation restrictions. The usefulness of the new technique is illustrated with an application to a cointegrated consumption-income regression model, yielding a straightforward test of the permanent income hypothesis.
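One way to see the idea: decompose a regressor into complementary frequency bands and estimate a separate coefficient for each band; unequal coefficients then signal parameter instability across frequencies. This sketch uses a simple FFT partition into two bands, not the authors' exact construction:

```python
import numpy as np

def band_split(x, f_cut):
    """Split x into complementary low- and high-frequency components
    (they sum back to x exactly) using an FFT partition."""
    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x))
    low = X.copy()
    low[freqs > f_cut] = 0
    high = X - low
    return np.fft.irfft(low, len(x)), np.fft.irfft(high, len(x))

# Simulate a relationship whose strength differs across frequencies.
rng = np.random.default_rng(1)
x = rng.normal(size=2000)
x_lo, x_hi = band_split(x, 0.1)
y = 2.0 * x_lo + 0.5 * x_hi + 0.1 * rng.normal(size=2000)

# One OLS coefficient per band; disjoint bands make the regressors
# exactly orthogonal, so the fit separates cleanly.
A = np.column_stack([x_lo, x_hi])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
```

A large gap between the band coefficients (here roughly 2.0 versus 0.5) is the frequency-dependence that an ordinary single-coefficient regression would average away, and that the proposed test is designed to detect.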


Author(s):  
Simon Vaughan

Progress in astronomy comes from interpreting the signals encoded in the light received from distant objects: the distribution of light over the sky (images), over photon wavelength (spectrum), over polarization angle and over time (usually called light curves by astronomers). In the time domain, we see transient events such as supernovae, gamma-ray bursts and other powerful explosions; we see periodic phenomena such as the orbits of planets around nearby stars, radio pulsars and pulsations of stars in nearby galaxies; and we see persistent aperiodic variations (‘noise’) from powerful systems such as accreting black holes. I review just a few of the recent and future challenges in the burgeoning area of time domain astrophysics, with particular attention to persistently variable sources, the recovery of reliable noise power spectra from sparsely sampled time series, higher order properties of accreting black holes, and time delays and correlations in multi-variate time series.


2020 ◽  
Vol 24 (11) ◽  
pp. 5473-5489 ◽  
Author(s):  
Justin Schulte ◽  
Frederick Policelli ◽  
Benjamin Zaitchik

Abstract. Wavelet coherence is a method that is commonly used in hydrology to extract scale-dependent, nonstationary relationships between time series. However, we show that the method cannot always determine why the time-domain correlation between two time series changes in time. We show that, even for stationary coherence, the time-domain correlation between two time series weakens if at least one of the time series has changing skewness. To overcome this drawback, a nonlinear coherence method is proposed to quantify the cross-correlation between nonlinear modes embedded in the time series. It is shown that nonlinear coherence and auto-bicoherence spectra can provide additional insight into changing time-domain correlations. The new method is applied to the El Niño–Southern Oscillation (ENSO) and all-India rainfall (AIR), which is intricately linked to hydrological processes across the Indian subcontinent. The nonlinear coherence analysis showed that the skewness of AIR is weakly correlated with that of two ENSO time series after the 1970s, indicating that increases in ENSO skewness after the 1970s at least partially contributed to the weakening ENSO–AIR relationship in recent decades. The implication of this result is that the intensity of skewed El Niño events is likely to overestimate India's drought severity, which was the case in the 1997 monsoon season, a time point when the nonlinear wavelet coherence between AIR and ENSO reached its lowest value in the 1871–2016 period. We determined that the association between the weakening ENSO–AIR relationship and ENSO nonlinearity could reflect the contribution of different nonlinear ENSO modes to ENSO diversity.


2009 ◽  
Vol 6 (2) ◽  
pp. 2451-2498 ◽  
Author(s):  
B. Schaefli ◽  
E. Zehe

Abstract. This paper proposes a method for rainfall-runoff model calibration and performance analysis in the wavelet domain by fitting the estimated wavelet power spectrum (a representation of the time-varying frequency content of a time series) of a simulated discharge series to that of the corresponding observed time series. As discussed in this paper, calibrating hydrological models so as to reproduce the time-varying frequency content of the observed signal can lead to different results from parameter estimation in the time domain. Therefore, wavelet-domain parameter estimation has the potential to give new insights into model performance and to reveal model structural deficiencies. We apply the proposed method to synthetic case studies and a real-world discharge modeling case study and discuss how model diagnosis can benefit from an analysis in the wavelet domain. The results show that for the real-world case study of precipitation–runoff modeling for a high alpine catchment, the calibrated discharge simulation captures the dynamics of the observed time series better than the results obtained through calibration in the time domain. In addition, the wavelet-domain performance assessment of this case study highlights which frequencies are not well reproduced by the model, which gives specific indications about how to improve the model structure.
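The calibration idea can be sketched with a toy model: compute a wavelet power spectrum for the observed and simulated discharge, and choose the parameter that minimizes the misfit between the two. The linear-reservoir model, the hand-rolled Ricker-wavelet transform, and the scales below are illustrative assumptions, not the paper's setup:

```python
import numpy as np

def ricker(points, a):
    """Ricker (Mexican-hat) wavelet of width parameter a."""
    t = np.arange(points) - (points - 1) / 2
    amp = 2 / (np.sqrt(3 * a) * np.pi ** 0.25)
    return amp * (1 - (t / a) ** 2) * np.exp(-0.5 * (t / a) ** 2)

def wavelet_power(x, scales):
    """Simple CWT power: squared Ricker convolutions, one row per scale."""
    return np.array([np.convolve(x, ricker(10 * int(a) + 1, a), mode="same") ** 2
                     for a in scales])

def reservoir(rain, k):
    """Toy linear-reservoir discharge model with recession constant k."""
    q = np.zeros_like(rain)
    for i in range(1, len(rain)):
        q[i] = (1 - 1 / k) * q[i - 1] + rain[i] / k
    return q

# Synthetic "observed" discharge from intermittent rainfall, true k = 8.
rng = np.random.default_rng(2)
rain = rng.exponential(1.0, 730) * (rng.random(730) < 0.3)
observed = reservoir(rain, k=8.0)
scales = [2, 4, 8, 16]

def misfit(k):
    """Wavelet-domain objective: misfit between power spectra."""
    sim = reservoir(rain, k)
    return np.mean((wavelet_power(sim, scales) - wavelet_power(observed, scales)) ** 2)

ks = np.arange(2.0, 15.0, 1.0)
best_k = ks[np.argmin([misfit(k) for k in ks])]
```

Because the objective compares time-varying frequency content scale by scale, a poor fit at a particular scale points directly at the frequency band the model fails to reproduce, which is the diagnostic benefit the abstract describes.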


Electronics ◽  
2019 ◽  
Vol 8 (9) ◽  
pp. 1046
Author(s):  
Changyou Suo ◽  
Zhonghua Li ◽  
Yunlong Sun ◽  
Yongsen Han

Current time-domain spectroscopy of dielectrics provides important information for analyzing dielectric properties and mechanisms. However, there is always interference during the testing process, which seriously affects the analysis of the test results. Effective filtering of the current time-domain spectroscopy is therefore particularly necessary. L1 trend filtering can accurately estimate the trend component of a time series and has been widely used in economics and sociology. This paper therefore attempts to apply L1 trend filtering to current time-domain spectroscopy. Firstly, polarization and depolarization currents are measured in the laboratory. Then the test results are filtered by L1 trend filtering and the filtering effects are compared with several common filtering algorithms, such as the sliding-mean filter and the Savitzky–Golay smoothing filter. Finally, the robustness and time complexity of L1 trend filtering are analyzed. The filtering results show that because the polarization currents vary over a wide range (about 2–3 orders of magnitude) during the test, common filtering algorithms can hardly produce smooth, undistorted curves over the whole test time range, whereas L1 trend filtering can. The robustness and time-complexity analyses show that L1 trend filtering can accurately extract the trend component at the different noise levels tested, and the execution time stays below 176.67 s when the number of test points is no more than 20,000. These results show that L1 trend filtering can be applied to the current time-domain spectroscopy of dielectrics.
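L1 trend filtering estimates the trend as the minimizer of 0.5·||y − x||² + λ·||D₂x||₁, where D₂ is the second-difference operator, so the estimate is piecewise linear with few kinks. The sketch below approximates the solution by iteratively reweighted least squares, an approximation rather than the exact interior-point solver usually cited:

```python
import numpy as np

def l1_trend_filter(y, lam, iters=50, eps=1e-6):
    """Approximate l1 trend filtering
        minimize 0.5*||y - x||^2 + lam * ||D2 x||_1
    via iteratively reweighted least squares (IRLS): each pass solves
    a weighted ridge problem with weights 1/(|D2 x| + eps)."""
    n = len(y)
    D = np.diff(np.eye(n), n=2, axis=0)      # (n-2) x n second differences
    x = y.copy()
    for _ in range(iters):
        w = 1.0 / (np.abs(D @ x) + eps)      # large weight -> enforce linearity
        x = np.linalg.solve(np.eye(n) + lam * D.T @ (w[:, None] * D), y)
    return x

# Noisy piecewise-linear series: the recovered trend should hug the
# underlying kinked line, unlike a short moving average.
rng = np.random.default_rng(3)
trend = np.concatenate([np.linspace(0, 5, 100), np.linspace(5, 2, 100)])
y = trend + 0.3 * rng.normal(size=200)
x_hat = l1_trend_filter(y, lam=10.0)
```

The piecewise-linear prior is what lets the method stay smooth across signals spanning several orders of magnitude, where a fixed-window smoother either over-smooths the fast part or under-smooths the slow part.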

