A remote-control datalogger for large-scale resistivity surveys and robust processing of its signals using a software Lock-In approach

2018 ◽  
Vol 7 (1) ◽  
pp. 55-66 ◽  
Author(s):  
Frank Oppermann ◽  
Thomas Günther

Abstract. We present a new versatile datalogger that can be used for a wide range of possible applications in geosciences. It is adjustable in signal strength and sampling frequency, battery saving and can be controlled remotely over a Global System for Mobile Communication (GSM) connection, which saves running costs, particularly in monitoring experiments. The internet connection allows for checking functionality, controlling schedules and optimizing pre-amplification. We mainly use it for large-scale electrical resistivity tomography (ERT), where it independently registers voltage time series on three channels while a square-wave current is injected. For the analysis of these time series we present a new approach based on the lock-in (LI) method, mainly known from electronic circuits. The method searches for the working point (phase) using three different functions based on a mask signal, and determines the amplitude using a direct current (DC) correlation function. We use synthetic data with different types of noise to compare the new method with existing approaches, i.e. selective stacking and a modified fast Fourier transformation (FFT)-based approach that assumes 1/f noise characteristics. All methods give comparable results, but the LI is better than the well-established stacking method. The FFT approach can be even better, but only if the noise strictly follows the assumed characteristics. If overshoots are present in the data, which is typical in the field, FFT performs worse even with good data, which is why we conclude that the new LI approach is the most robust solution. This is also proved by a field data set from a long 2-D ERT profile.
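
As a rough illustration of the software lock-in idea described above, the following Python sketch correlates a noisy voltage record with a reference ("mask") square wave at the injection frequency, scanning candidate phases for the working point and using a DC correlation for the amplitude. It is a minimal sketch under assumed inputs (sampling rate fs, injection frequency f_inj), not the authors' implementation.

```python
import numpy as np

def lockin_amplitude(v, fs, f_inj, n_phases=360):
    """Minimal software lock-in sketch (illustrative, not the paper's code):
    correlate the voltage record v (sampled at fs) with a reference square
    wave at the injection frequency f_inj, scanning phases to find the
    working point, and estimate the amplitude by DC correlation."""
    t = np.arange(len(v)) / fs
    best_amp, best_phase = 0.0, 0.0
    for phi in np.linspace(0.0, 2 * np.pi, n_phases, endpoint=False):
        mask = np.sign(np.sin(2 * np.pi * f_inj * t + phi))   # reference square wave
        amp = np.dot(v, mask) / np.dot(mask, mask)            # DC correlation estimate
        if abs(amp) > abs(best_amp):
            best_amp, best_phase = amp, phi
    return best_amp, best_phase
```

In practice the amplitude estimate would be converted to an apparent resistivity using the injected current and the electrode geometry; that step is omitted here.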


Author(s):  
Eun-Young Mun ◽  
Anne E. Ray

Integrative data analysis (IDA) is a promising new approach in psychological research and has been well received in the field of alcohol research. This chapter provides a larger unifying research synthesis framework for IDA. Major advantages of IDA of individual participant-level data include better and more flexible ways to examine subgroups, model complex relationships, deal with methodological and clinical heterogeneity, and examine infrequently occurring behaviors. However, between-study heterogeneity in measures, designs, and samples, as well as systematic study-level missing data, are significant barriers to IDA and, more broadly, to large-scale research synthesis. Drawing on the authors' experience with the Project INTEGRATE data set, which combined individual participant-level data from 24 independent college brief alcohol intervention studies, the chapter also recognizes that IDA investigations require a wide range of expertise and considerable resources, and that minimum standards for reporting IDA studies may be needed to improve the transparency and quality of evidence.


2021 ◽  
Author(s):  
Andrew J Kavran ◽  
Aaron Clauset

Abstract. Background: Large-scale biological data sets are often contaminated by noise, which can impede accurate inferences about underlying processes. Such measurement noise can arise from endogenous biological factors like cell cycle and life history variation, and from exogenous technical factors like sample preparation and instrument variation. Results: We describe a general method for automatically reducing noise in large-scale biological data sets. This method uses an interaction network to identify groups of correlated or anti-correlated measurements that can be combined or “filtered” to better recover an underlying biological signal. Similar to the process of denoising an image, a single network filter may be applied to an entire system, or the system may first be decomposed into distinct modules and a different filter applied to each. Applied to synthetic data with known network structure and signal, network filters accurately reduce noise across a wide range of noise levels and structures. Applied to a machine learning task of predicting changes in human protein expression in healthy and cancerous tissues, network filtering prior to training increases accuracy by up to 43% compared to using unfiltered data. Conclusions: Network filters are a general way to denoise biological data and can account for both correlation and anti-correlation between different measurements. Furthermore, we find that partitioning a network prior to filtering can significantly reduce errors in networks with heterogeneous data and correlation patterns, and this approach outperforms existing diffusion-based methods. Our results on proteomics data indicate the broad potential utility of network filters for applications in systems biology.
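
As a loose sketch of the neighbor-combining idea behind such network filters (not the authors' algorithm), the following Python snippet blends each measurement with the sign-adjusted mean of its network neighbors; the edge-list format, sign convention, and blending weight alpha are illustrative assumptions.

```python
import numpy as np

def network_filter(x, edges, alpha=0.5):
    """Illustrative network-filter sketch: each measurement x[i] is blended
    with the sign-adjusted mean of its neighbors. edges is a list of
    (i, j, sign) tuples with sign = +1 for correlated and -1 for
    anti-correlated pairs; alpha controls the blend (assumed parameters)."""
    x = np.asarray(x, dtype=float)
    neighbor_sum = np.zeros_like(x)
    neighbor_cnt = np.zeros_like(x)
    for i, j, sign in edges:
        neighbor_sum[i] += sign * x[j]
        neighbor_cnt[i] += 1
        neighbor_sum[j] += sign * x[i]
        neighbor_cnt[j] += 1
    # nodes without neighbors keep their own value
    neighbor_mean = np.divide(neighbor_sum, neighbor_cnt,
                              out=x.copy(), where=neighbor_cnt > 0)
    return (1 - alpha) * x + alpha * neighbor_mean
```

Partitioning the network into modules first, as the abstract describes, would simply amount to applying a separately tuned filter to each module's subgraph.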


2020 ◽  
Author(s):  
Yuan Yuan ◽  
Lei Lin

Satellite image time series (SITS) classification is a major research topic in remote sensing and is relevant for a wide range of applications. Deep learning approaches have been commonly employed for SITS classification and have provided state-of-the-art performance. However, deep learning methods suffer from overfitting when labeled data is scarce. To address this problem, we propose a novel self-supervised pre-training scheme to initialize a Transformer-based network by utilizing large-scale unlabeled data. In detail, the model is asked to predict randomly contaminated observations given an entire time series of a pixel. The main idea of our proposal is to leverage the inherent temporal structure of satellite time series to learn general-purpose spectral-temporal representations related to land cover semantics. Once pre-training is completed, the pre-trained network can be further adapted to various SITS classification tasks by fine-tuning all the model parameters on small-scale task-related labeled data. In this way, the general knowledge and representations about SITS can be transferred to a label-scarce task, thereby improving the generalization performance of the model as well as reducing the risk of overfitting. Comprehensive experiments have been carried out on three benchmark datasets over large study areas. Experimental results demonstrate the effectiveness of the proposed method, leading to a classification accuracy increase of 1.91% to 6.69%.
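
A minimal PyTorch sketch of the masked-reconstruction pre-training objective described above is given below; the architecture sizes, masking ratio, and corruption scheme are illustrative assumptions, not the configuration used in the paper.

```python
import torch
import torch.nn as nn

class SITSMaskedPretrainer(nn.Module):
    """Sketch of the self-supervised objective: randomly contaminate some
    observations of a pixel's spectral time series and train a Transformer
    encoder to predict the original values at those positions.
    All dimensions and hyperparameters here are assumptions."""
    def __init__(self, n_bands=10, d_model=64, n_heads=4, n_layers=3):
        super().__init__()
        self.embed = nn.Linear(n_bands, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, n_bands)

    def forward(self, x, mask_ratio=0.15):
        # x: (batch, time, bands) spectral time series, one pixel per sample
        mask = torch.rand(x.shape[:2], device=x.device) < mask_ratio
        corrupted = x.clone()
        corrupted[mask] = torch.randn_like(x)[mask]      # contaminate masked steps
        recon = self.head(self.encoder(self.embed(corrupted)))
        loss = ((recon - x)[mask] ** 2).mean()           # loss only on masked steps
        return loss
```

After pre-training, the encoder weights would be kept and a classification head fine-tuned on the small labeled set, as the abstract outlines.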


2017 ◽  
Vol 44 (2) ◽  
pp. 203-229 ◽  
Author(s):  
Javier D Fernández ◽  
Miguel A Martínez-Prieto ◽  
Pablo de la Fuente Redondo ◽  
Claudio Gutiérrez

The publication of semantic web data, commonly represented in Resource Description Framework (RDF), has experienced outstanding growth over the last few years. Data from all fields of knowledge are shared publicly and interconnected in active initiatives such as Linked Open Data. However, despite the increasing availability of applications managing large-scale RDF information such as RDF stores and reasoning tools, little attention has been given to the structural features emerging in real-world RDF data. Our work addresses this issue by proposing specific metrics to characterise RDF data. We specifically focus on revealing the redundancy of each data set, as well as common structural patterns. We evaluate the proposed metrics on several data sets, which cover a wide range of designs and models. Our findings provide a basis for more efficient RDF data structures, indexes and compressors.
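
To make the flavour of such structural metrics concrete, here is a small Python sketch using rdflib that computes a few basic statistics (triples per subject, predicate usage); the metric names are illustrative and are not the specific metrics defined in the paper.

```python
from collections import Counter
from rdflib import Graph

def structural_metrics(ttl_text):
    """Illustrative sketch: basic structural statistics of an RDF graph,
    from which redundancy-oriented measures (e.g. average triples per
    subject) can be derived. Not the paper's metric definitions."""
    g = Graph()
    g.parse(data=ttl_text, format="turtle")
    out_degree = Counter(s for s, p, o in g)       # triples per subject
    predicate_use = Counter(p for s, p, o in g)    # triples per predicate
    return {
        "triples": len(g),
        "subjects": len(out_degree),
        "predicates": len(predicate_use),
        "mean_out_degree": len(g) / max(len(out_degree), 1),
    }
```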


2020 ◽  
Vol 12 (19) ◽  
pp. 3207
Author(s):  
Ioannis Papoutsis ◽  
Charalampos Kontoes ◽  
Stavroula Alatza ◽  
Alexis Apostolakis ◽  
Constantinos Loupasakis

Advances in synthetic aperture radar (SAR) interferometry have enabled the seamless monitoring of the Earth’s crust deformation. The dense archive of the Sentinel-1 Copernicus mission provides unprecedented spatial and temporal coverage; however, time-series analysis of such big data volumes requires high computational efficiency. We present parallelized PSI (P-PSI), a novel, parallelized, end-to-end processing chain for the fully automated assessment of line-of-sight ground velocities through persistent scatterer interferometry (PSI), tailored to scale to the vast multitemporal archive of Sentinel-1 data. P-PSI is designed to transparently access different and complementary Sentinel-1 repositories and download the appropriate datasets for PSI. To make it efficient for large-scale applications, we re-engineered and parallelized interferogram creation and multitemporal interferometric processing, and introduced distributed implementations to make the best use of computing cores and provide resourceful storage management. We propose a new algorithm to further enhance processing efficiency, which establishes a non-uniform patch grid based on land use and the expected number of persistent scatterers. P-PSI achieves an overall speed-up by a factor of five for processing a full Sentinel-1 frame on a 20-core server. The processing chain was tested on a large-scale project to calculate and monitor deformation patterns over the entire extent of the Greek territory: our own Interferometric SAR (InSAR) Greece project. Time-series InSAR analysis was performed on about 12 TB of input data, corresponding to more than 760 Single Look Complex Sentinel-1A and B images mostly covering mainland Greece in the period 2015–2019. InSAR Greece provides detailed ground motion information for more than 12 million distinct locations, offering completely new insights into the impact of geophysical and anthropogenic activities at this geographic scale. This new information is critical to enhancing our understanding of the underlying mechanisms and provides valuable input to risk assessment models. We showcase this through the identification of various characteristic geohazard locations in Greece and discuss their criticality. The selected geohazard locations, among a thousand, cover a wide range of catastrophic events, including landslides, land subsidence, and structural failures of various scales, ranging from a few hundred square meters up to the basin scale. The study enriches the large catalog of geophysics-related phenomena maintained by the GeObservatory portal of the Center of Earth Observation Research and Satellite Remote Sensing BEYOND of the National Observatory of Athens, opening new knowledge to the wider scientific community.
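
The non-uniform patch idea can be illustrated with a small Python sketch that cuts a frame along one axis so that every patch carries roughly the same expected number of persistent scatterers; the input density map and the single-axis split are simplifying assumptions, not the P-PSI algorithm itself.

```python
import numpy as np

def balanced_patch_edges(expected_ps_density, n_patches):
    """Illustrative sketch: split a frame into column bands so that each
    band contains roughly the same expected number of persistent scatterers
    (wider bands over water or forest, narrower ones over urban land).
    expected_ps_density is an assumed 2D map of expected PS per pixel."""
    col_totals = expected_ps_density.sum(axis=0)        # expected PS per column
    cum = np.cumsum(col_totals)
    targets = np.linspace(0, cum[-1], n_patches + 1)[1:-1]
    edges = np.searchsorted(cum, targets)               # column indices of the cuts
    return [0, *edges.tolist(), expected_ps_density.shape[1]]
```

Balancing the expected PS count per patch keeps the per-core workload roughly even, which is the motivation the abstract gives for the non-uniform grid.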


2015 ◽  
Vol 8 (11) ◽  
pp. 4645-4655 ◽  
Author(s):  
B. Ehard ◽  
B. Kaifler ◽  
N. Kaifler ◽  
M. Rapp

Abstract. This study evaluates commonly used methods of extracting gravity-wave-induced temperature perturbations from lidar measurements. The spectral response of these methods is characterized with the help of a synthetic data set with known temperature perturbations added to a realistic background temperature profile. The simulations are carried out with the background temperature being either constant or varying in time to evaluate the sensitivity to temperature perturbations not caused by gravity waves. The different methods are applied to lidar measurements over New Zealand, and the performance of the algorithms is evaluated. We find that the Butterworth filter performs best if gravity waves over a wide range of periods are to be extracted from lidar temperature measurements. The running mean method gives good results if only gravity waves with short periods are to be analyzed.
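
For orientation, a minimal Python sketch of a Butterworth-type temporal filter is given below using scipy; the filter order, cutoff period, and high-pass form are illustrative assumptions and not necessarily the settings evaluated in the study.

```python
from scipy.signal import butter, filtfilt

def gw_perturbations(temperature, dt_hours, cutoff_period_h=8.0, order=5):
    """Illustrative sketch: high-pass filter a lidar temperature time series
    at one altitude so that only fluctuations with periods shorter than
    cutoff_period_h remain, interpreting them as gravity-wave-induced
    perturbations. Parameter values are assumptions, not the paper's."""
    fs = 1.0 / dt_hours                          # samples per hour
    wn = (1.0 / cutoff_period_h) / (0.5 * fs)    # normalized cutoff frequency
    b, a = butter(order, wn, btype="highpass")
    return filtfilt(b, a, temperature)           # zero-phase filtering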


2019 ◽  
Vol 10 (1) ◽  
pp. 73 ◽  
Author(s):  
Einar Agletdinov ◽  
Dmitry Merson ◽  
Alexei Vinogradov

A novel methodology is proposed to enhance the reliability of detecting low-amplitude transients in a noisy time series. Such time series often arise in a wide range of practical situations where different sensors are used for condition monitoring of mechanical systems, integrity assessment of industrial facilities and/or microseismicity studies. In all these cases, the early and reliable detection of possible damage is of paramount importance and is practically limited by the detectability of transient signals on a background of random noise. The proposed triggering algorithm is based on a logarithmic derivative of the power spectral density function. It was tested on synthetic data mimicking an actual ultrasonic acoustic emission signal recorded continuously at different signal-to-noise ratios (SNRs). Considerable advantages of the proposed method over the established fixed amplitude threshold and STA/LTA (Short Time Average / Long Time Average) techniques are demonstrated in comparative tests.
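
For reference, the STA/LTA baseline mentioned above can be sketched in a few lines of Python; the window lengths and the energy-based characteristic function are common conventions and are assumptions here, not the paper's exact configuration.

```python
import numpy as np

def sta_lta(x, n_sta, n_lta):
    """Classic STA/LTA characteristic function (the comparison baseline):
    ratio of a short-term average to a long-term average of signal energy.
    A transient is declared wherever the ratio exceeds a chosen threshold."""
    energy = np.asarray(x, dtype=float) ** 2
    sta = np.convolve(energy, np.ones(n_sta) / n_sta, mode="same")
    lta = np.convolve(energy, np.ones(n_lta) / n_lta, mode="same")
    return sta / np.maximum(lta, 1e-12)          # guard against division by zero
```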


2020 ◽  
Vol 35 (2) ◽  
pp. 214-222
Author(s):  
Lisa Cenek ◽  
Liubou Klindziuk ◽  
Cindy Lopez ◽  
Eleanor McCartney ◽  
Blanca Martin Burgos ◽  
...  

Circadian rhythms are daily oscillations in physiology and behavior that can be assessed by recording body temperature, locomotor activity, or bioluminescent reporters, among other measures. These different types of data can vary greatly in waveform, noise characteristics, typical sampling rate, and length of recording. We developed 2 Shiny apps for exploration of these data, enabling visualization and analysis of circadian parameters such as period and phase. Methods include the discrete wavelet transform, sine fitting, the Lomb-Scargle periodogram, autocorrelation, and maximum entropy spectral analysis, giving a sense of how well each method works on each type of data. The apps also provide educational overviews and guidance for these methods, supporting the training of those new to this type of analysis. CIRCADA-E (Circadian App for Data Analysis–Experimental Time Series) allows users to explore a large curated experimental data set with mouse body temperature, locomotor activity, and PER2::LUC rhythms recorded from multiple tissues. CIRCADA-S (Circadian App for Data Analysis–Synthetic Time Series) generates and analyzes time series with user-specified parameters, thereby demonstrating how the accuracy of period and phase estimation depends on the type and level of noise, sampling rate, length of recording, and method. We demonstrate the potential uses of the apps through 2 in silico case studies.
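
As an illustration of one of the period-estimation methods offered by the apps, the short Python sketch below uses the Lomb-Scargle periodogram from scipy to pick the best-fitting circadian period; the period search range and grid resolution are illustrative assumptions.

```python
import numpy as np
from scipy.signal import lombscargle

def estimate_period(t_hours, y, min_period=20.0, max_period=28.0, n_candidates=2000):
    """Illustrative sketch: scan candidate periods with the Lomb-Scargle
    periodogram (suitable for unevenly sampled recordings) and return the
    period with the highest spectral power."""
    periods = np.linspace(min_period, max_period, n_candidates)
    ang_freqs = 2.0 * np.pi / periods                  # angular frequencies
    power = lombscargle(np.asarray(t_hours, dtype=float),
                        np.asarray(y, dtype=float) - np.mean(y),
                        ang_freqs)
    return periods[np.argmax(power)]
```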


Geophysics ◽  
2016 ◽  
Vol 81 (1) ◽  
pp. V7-V16 ◽  
Author(s):  
Kenji Nose-Filho ◽  
André K. Takahata ◽  
Renato Lopes ◽  
João M. T. Romano

We have addressed blind deconvolution in a multichannel framework. Recently, a robust solution to this problem based on a Bayesian approach called sparse multichannel blind deconvolution (SMBD) was proposed in the literature with interesting results. However, its computational complexity can be high. We have proposed a fast algorithm based on minimum entropy deconvolution, which is considerably less expensive. We designed the deconvolution filter to minimize a normalized version of the hybrid ℓ1/ℓ2-norm loss function. This is in contrast to SMBD, in which the hybrid ℓ1/ℓ2-norm function is used as a regularization term to directly determine the deconvolved signal. Results with synthetic data show that the performance of the obtained deconvolution filter is similar to that obtained in a supervised framework. Similar results were also obtained for both techniques on a real marine data set.
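
To give a feel for minimum-entropy-style filter design, the Python sketch below implements a classic Wiggins-type MED iteration (spikiness measured through a fourth-power objective); it is a generic textbook variant under assumed filter length and iteration count, not the normalized hybrid-norm criterion of the paper.

```python
import numpy as np
from scipy.linalg import toeplitz, solve

def med_filter(x, flen=21, n_iter=30):
    """Illustrative Wiggins-style minimum entropy deconvolution: iteratively
    update a filter so that its output becomes as spiky as possible.
    flen and n_iter are assumed values; requires flen <= len(x)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    ac = np.correlate(x, x, mode="full")[n - 1:n - 1 + flen]   # autocorrelation lags 0..flen-1
    R = toeplitz(ac) + 1e-6 * ac[0] * np.eye(flen)             # regularized Toeplitz system
    f = np.zeros(flen)
    f[flen // 2] = 1.0                                         # start from a delayed spike
    for _ in range(n_iter):
        y = np.convolve(x, f)                                  # full convolution, length n+flen-1
        b = np.array([np.dot(y[k:k + n] ** 3, x) for k in range(flen)])
        f = solve(R, b)
        f /= np.linalg.norm(f)                                 # fix the scale ambiguity
    return f
```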

