The sign filter for seismic event detection

Geophysics ◽  
1988 ◽  
Vol 53 (8) ◽  
pp. 1024-1033 ◽  
Author(s):  
Katherine M. Hansen ◽  
Kabir Roy‐Chowdhury ◽  
Robert A. Phinney

The theory of statistical hypothesis testing is used to develop and apply a seismic signal detection filter. The filter, herein named the sign filter, scans a stacked section and designates a linear segment as “signal” or “noise” based on the value of the sign test statistic evaluated over the amplitudes within the segment; only the signals are passed. The sign test statistic is nonparametric, so that probabilistic calculations related to the filtering process do not require rigid assumptions regarding the noise distribution. Consequently, it is possible to calculate both the probability that the filter will pass a segment containing only noise, and the expected number of noise‐only segments to be passed. These numbers may be adjusted by changing the tunable parameters of the filter. The detector was tested on both synthetic and field data. For synthetic data, all of the signals present in the data were identified, and the output did not contain any spurious signals, even for a signal‐to‐noise ratio smaller than 1. For field data, the events chosen by the filter, for the most part, agree closely with those visible in the input section; and much of the spatially incoherent energy is suppressed. A few of the passed segments were not visually coherent in the input stack; we suggest a method by which such segments might be identified and removed. The method is fairly general and may be modified for different definitions of signal. The case of linear alignments is the easiest to implement, and the detector promises to be useful in both the processing (automatic picking of first arrivals in source gathers) and interpretation (identification of primary reflections in stacked sections) phases of seismic data analysis.
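The segment-classification step can be illustrated with a small sketch. The snippet below is not the authors' filter, only a minimal illustration of the underlying idea: under a noise-only null with amplitudes symmetric about zero, the count of positive samples in a segment is Binomial(n, 1/2), so the segment is passed as "signal" only when that count is improbable under the null; the threshold `alpha` plays the role of a tunable false-alarm parameter.

```python
import math

def sign_test_pvalue(amplitudes):
    """Two-sided sign-test p-value for the null that the amplitudes
    are symmetric about zero (noise only)."""
    n = len(amplitudes)
    k = sum(1 for a in amplitudes if a > 0)  # ignore exact zeros for simplicity
    # Binomial(n, 1/2) tail probability, doubled for a two-sided test
    tail = sum(math.comb(n, i) for i in range(max(k, n - k), n + 1)) / 2 ** n
    return min(1.0, 2 * tail)

def classify_segment(amplitudes, alpha=0.05):
    """Pass the segment as 'signal' when the sign statistic is
    improbable under the noise-only null; otherwise call it 'noise'."""
    return "signal" if sign_test_pvalue(amplitudes) < alpha else "noise"
```

Lowering `alpha` reduces the expected number of noise-only segments passed, at the cost of missing weaker signals, mirroring the tunable-parameter trade-off described in the abstract.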

Author(s):  
Alma Andersson ◽  
Joakim Lundeberg

Abstract
Motivation: Collection of spatial signals in large numbers has become a routine task in multiple omics fields, but parsing of these rich datasets still poses certain challenges. In whole or near-full transcriptome spatial techniques, spurious expression profiles are intermixed with those exhibiting an organized structure. To distinguish profiles with spatial patterns from the background noise, a metric that enables quantification of spatial structure is desirable. Current methods designed for similar purposes tend to be built around a framework of statistical hypothesis testing, hence we were compelled to explore a fundamentally different strategy.
Results: We propose an unexplored approach to analyzing spatial transcriptomics data, simulating diffusion of individual transcripts to extract genes with spatial patterns. The method performed as expected when presented with synthetic data. When applied to real data, it identified genes with distinct spatial profiles, involved in key biological processes or characteristic of certain cell types. Compared to existing methods, ours seemed to be less informed by the genes' expression levels and showed better time performance when run with multiple cores.
Availability and implementation: Open-source Python package with a command line interface (CLI), freely available at https://github.com/almaan/sepal under an MIT licence. A mirror of the GitHub repository can be found at Zenodo, doi: 10.5281/zenodo.4573237.
Supplementary information: Supplementary data are available at Bioinformatics online.
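The diffusion idea can be caricatured in a few lines. This is a conceptual illustration only, not sepal's implementation (which operates on the actual array geometry): expression profiles with spatial structure take longer to relax to uniformity under simulated diffusion, so the relaxation time itself serves as a spatial-structure score.

```python
def diffusion_time(values, rate=0.25, tol=1e-3, max_iter=10000):
    """Iterate explicit 1-D diffusion steps (reflecting boundaries) until
    the profile is nearly uniform; structured profiles relax more slowly,
    so a larger step count indicates more spatial pattern."""
    v = list(values)
    mean = sum(v) / len(v)  # conserved by the zero-flux boundary scheme
    for step in range(max_iter):
        if max(abs(x - mean) for x in v) < tol * (abs(mean) + 1):
            return step
        v = [v[i] + rate * ((v[i - 1] if i > 0 else v[i])
                            - 2 * v[i]
                            + (v[i + 1] if i < len(v) - 1 else v[i]))
             for i in range(len(v))]
    return max_iter
```

A flat profile scores 0 immediately, while a step-shaped profile needs many diffusion iterations to flatten, which is the ordering the score is meant to capture.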


Geophysics ◽  
2020 ◽  
Vol 85 (3) ◽  
pp. Q1-Q10 ◽  
Author(s):  
Kai Lu ◽  
Sergio Chávez-Pérez

We have developed the theory and practice of 3D supervirtual interferometry (SVI) for enhancing the signal-to-noise ratio (S/N) of refraction arrivals in 3D data. Unlike 2D SVI, 3D SVI requires an extra integration along the inline direction to compute the stationary source-receiver pairs for enhanced stacking of the refraction events. The result is a significant increase in the S/N of first arrivals in the far-offset traces. We have evaluated 3D synthetic and field data examples to demonstrate the effectiveness of the proposed method. For the synthetic data tests, SVI has extended the source-receiver offset range of pickable traces from 11 to 15 km. In the field data example, SVI has extended the source-receiver offset of traces with pickable first-arrival traveltimes from 12 km to a maximum of 18 km, and the total number of reliable traveltime picks has increased by 12%, which contributes to a deeper velocity update in the traveltime tomogram.
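The reason stacking over stationary source-receiver pairs raises the S/N can be seen in a toy sketch (not the SVI workflow itself, which first constructs supervirtual traces by correlation and convolution): averaging N aligned traces leaves the coherent arrival untouched while incoherent noise is attenuated by roughly 1/sqrt(N).

```python
import random
import statistics

def stack(traces):
    """Average aligned traces sample-by-sample; coherent arrivals add
    in phase while zero-mean random noise averages toward zero."""
    n = len(traces)
    return [sum(t[i] for t in traces) / n for i in range(len(traces[0]))]

random.seed(0)
# a boxcar 'first arrival' buried in unit-variance noise, 64 realizations
signal = [1.0 if 40 <= i < 45 else 0.0 for i in range(100)]
traces = [[s + random.gauss(0, 1.0) for s in signal] for _ in range(64)]
noise_single = statistics.pstdev(traces[0][:40])       # noise level before stacking
noise_stacked = statistics.pstdev(stack(traces)[:40])  # ~1/sqrt(64) of the above
```

With 64 traces the noise standard deviation drops by about a factor of 8, which is why extending the set of stackable stationary pairs directly extends the pickable offset range.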


Geophysics ◽  
2020 ◽  
Vol 85 (4) ◽  
pp. V367-V376 ◽  
Author(s):  
Omar M. Saad ◽  
Yangkang Chen

Attenuation of seismic random noise is considered an important processing step to enhance the signal-to-noise ratio of seismic data. A new approach is proposed to attenuate random noise based on a deep-denoising autoencoder (DDAE). In this approach, the time-series seismic data are used as an input for the DDAE. The DDAE encodes the input seismic data to multiple levels of abstraction, and then it decodes those levels to reconstruct the seismic signal without noise. The DDAE is pretrained in a supervised way using synthetic data; following this, the pretrained model is used to denoise the field data set in an unsupervised scheme using a new customized loss function. We have assessed the proposed algorithm based on four synthetic data sets and two field examples, and we compare the results with several benchmark algorithms, such as f-x deconvolution (f-x deconv) and f-x singular spectrum analysis (f-x SSA). As a result, our algorithm succeeds in attenuating the random noise in an effective manner.
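As a schematic of the supervised pretraining idea (noisy inputs, clean targets), here is a deliberately tiny linear denoising autoencoder trained by SGD in pure Python. It sketches the concept only; the paper's DDAE is a deep network with a customized loss for the unsupervised field-data stage, neither of which is reproduced here, and all sizes and rates below are invented for illustration.

```python
import random

def mse(W1, W2, pairs):
    """Mean squared reconstruction error against the clean targets."""
    total = 0.0
    for noisy, clean in pairs:
        h = [sum(w * x for w, x in zip(row, noisy)) for row in W1]   # encode
        out = [sum(w * v for w, v in zip(row, h)) for row in W2]     # decode
        total += sum((o - c) ** 2 for o, c in zip(out, clean))
    return total / len(pairs)

def train_dae(pairs, W1, W2, lr=0.02, epochs=300):
    """Fit the clean trace from the noisy one by SGD on a one-hidden-layer
    linear autoencoder; W1 (k x d) and W2 (d x k) are updated in place."""
    d, k = len(W2), len(W1)
    for _ in range(epochs):
        for noisy, clean in pairs:
            h = [sum(w * x for w, x in zip(row, noisy)) for row in W1]
            out = [sum(w * v for w, v in zip(row, h)) for row in W2]
            err = [o - c for o, c in zip(out, clean)]            # dL/d(out), up to 2x
            dh = [sum(W2[i][j] * err[i] for i in range(d)) for j in range(k)]
            for i in range(d):
                for j in range(k):
                    W2[i][j] -= lr * err[i] * h[j]
            for j in range(k):
                for i in range(d):
                    W1[j][i] -= lr * dh[j] * noisy[i]

def rand_matrix(rows, cols, rng):
    return [[rng.uniform(-0.1, 0.1) for _ in range(cols)] for _ in range(rows)]

# synthetic pretraining pairs: clean traces plus additive noise
rng = random.Random(0)
d, k = 6, 2
clean_traces = [[rng.gauss(0, 1)] * d for _ in range(20)]
pairs = [([c + 0.3 * rng.gauss(0, 1) for c in tr], tr) for tr in clean_traces]
W1 = rand_matrix(k, d, rng)
W2 = rand_matrix(d, k, rng)
loss_before = mse(W1, W2, pairs)
train_dae(pairs, W1, W2)
loss_after = mse(W1, W2, pairs)
```

The compress-then-reconstruct bottleneck is what forces the network to keep coherent structure and drop incoherent noise; deep nonlinear encoders simply do this at multiple levels of abstraction.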


Mathematics ◽  
2020 ◽  
Vol 8 (4) ◽  
pp. 551 ◽  
Author(s):  
Jung-Lin Hung ◽  
Cheng-Che Chen ◽  
Chun-Mei Lai

Taking advantage of the possibility that a fuzzy test statistic falls in the rejection region, this study proposes a statistical hypothesis testing approach for fuzzy data. In contrast to classical statistical testing, which yields a binary decision to reject or accept a null hypothesis, the proposed approach determines the possibility of accepting the null hypothesis (or the alternative hypothesis). When the data are crisp, the proposed approach reduces to the classical hypothesis testing approach.
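A minimal sketch of the flavor of such a test, using an assumed interval-valued construction rather than the authors' formulation: if an alpha-cut of the fuzzy test statistic is an interval, the fraction of that interval lying in the rejection region gives a crude possibility degree of rejection, and crisp data collapse it back to a 0/1 decision.

```python
def possibility_of_rejection(stat_lo, stat_hi, crit):
    """Degree to which the interval [stat_lo, stat_hi] (an alpha-cut of a
    fuzzy test statistic) lies in the one-sided rejection region > crit."""
    if stat_hi <= crit:
        return 0.0  # entirely in the acceptance region
    if stat_lo >= crit:
        return 1.0  # entirely in the rejection region: a crisp rejection
    return (stat_hi - crit) / (stat_hi - stat_lo)

def interval_mean(intervals):
    """Interval-arithmetic mean of interval-valued (fuzzy) observations."""
    n = len(intervals)
    return (sum(lo for lo, _ in intervals) / n,
            sum(hi for _, hi in intervals) / n)
```

For crisp data the statistic interval degenerates to a point, so the possibility degree is exactly 0 or 1, matching the claim that the approach reduces to classical testing.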


2021 ◽  
Author(s):  
Cristina Prieto ◽  
Dmitri Kavetski ◽  
Nataliya Le Vine ◽  
César Álvarez ◽  
Raúl Medina

In hydrological modelling, the identification of the hydrological model mechanisms best suited to representing individual hydrological (physical) processes is a major research and operational challenge. We present a statistical hypothesis-testing perspective to identify dominant hydrological mechanisms. The method combines: (i) Bayesian estimation of the posterior probabilities of individual mechanisms from a given ensemble of model structures; (ii) a test statistic that defines a “dominant” mechanism as one more probable than all its alternatives given the observed data; and (iii) a flexible modelling framework to generate model structures from combinations of available mechanisms. The uncertainty in the test statistic is approximated via bootstrap from the ensemble of model structures. Synthetic and real data experiments are conducted using 624 model structures from the hydrological modelling system FUSE and data from the Leizarán catchment in northern Spain. The findings show that the mechanism identification method is reliable: it identifies the correct mechanism as dominant in all synthetic trials where an identification is made. As data/model errors increase, statistical power (identifiability) decreases, manifesting as trials where no mechanism is identified as dominant. The real data case study results are broadly consistent with the synthetic analysis, with dominant mechanisms identified for 4 of 7 processes. Insights into which processes are most/least identifiable are also reported. The mechanism identification method is expected to contribute to broader community efforts to improve model identification and process representation in hydrology.
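Steps (i)-(ii) and the bootstrap can be sketched as follows. This is a schematic reading of the test with made-up data structures, not FUSE output: each model structure carries a mechanism label and a posterior weight, and a mechanism is declared dominant only if it outweighs all alternatives in at least a fraction `level` of bootstrap replicates.

```python
import random
from collections import Counter

def dominant_mechanism(mechs, weights, n_boot=500, level=0.95, seed=0):
    """mechs[i]: mechanism used by model structure i; weights[i]: its
    posterior probability. Bootstrap the ensemble of structures and call
    a mechanism dominant when it carries more posterior mass than every
    alternative in >= `level` of the replicates; otherwise return None."""
    rng = random.Random(seed)
    idx = list(range(len(mechs)))
    wins = Counter()
    for _ in range(n_boot):
        mass = Counter()
        for i in (rng.choice(idx) for _ in idx):  # resample model structures
            mass[mechs[i]] += weights[i]
        top = mass.most_common(2)
        if len(top) == 1 or top[0][1] > top[1][1]:
            wins[top[0][0]] += 1
    for mech, w in wins.items():
        if w / n_boot >= level:
            return mech
    return None  # no mechanism identified as dominant
```

Returning `None` in ambiguous cases corresponds to the loss of statistical power the abstract describes: as errors grow, fewer trials yield a dominant mechanism.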


Author(s):  
Manuel García-Magariños ◽  
Thore Egeland ◽  
Ignacio López-de-Ullibarri ◽  
Nils L. Hjort ◽  
Antonio Salas

Abstract
There is a large number of applications where family relationships need to be determined from DNA data. In forensic science, competing ideas are in general verbally formulated as the two hypotheses of a test. For the most common paternity case, the null hypothesis states that the alleged father is the true father against the alternative hypothesis that the father is an unrelated man. A likelihood ratio is calculated to summarize the evidence. We propose an alternative framework whereby a model and the hypotheses are formulated in terms of parameters representing identity-by-descent probabilities. There are several advantages to this approach. Firstly, the alternative hypothesis can be completely general. Specifically, the alternative does not need to specify an unrelated man. Secondly, the parametric formulation corresponds to the approach used in most other applications of statistical hypothesis testing and so there is a large theory of classical statistics that can be applied. Theoretical properties of the test statistic under the null hypothesis are studied. An extension to trios of individuals has been carried out. The methods are exemplified using simulations and a real dataset of 27 Spanish Romani individuals.
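The identity-by-descent parameterization can be made concrete with a small sketch. Below is a minimal, assumed single-locus implementation under Hardy-Weinberg equilibrium, not the authors' model: the likelihood of an unordered genotype pair is a mixture over sharing 0, 1, or 2 alleles identical by descent, with parent-offspring at kappa = (0, 1, 0) and unrelated individuals at kappa = (1, 0, 0).

```python
def hwe(g, p):
    """Hardy-Weinberg genotype probability from allele frequencies p."""
    a, b = g
    return p[a] ** 2 if a == b else 2 * p[a] * p[b]

def trans_one_ibd(g1, g2, p):
    """P(G2 | G1, exactly one allele IBD): pass one of G1's alleles at
    random, draw the other allele from the population frequencies."""
    total = 0.0
    for s in g1:                 # candidate shared (IBD) allele
        x, y = g2
        if x == y:
            total += (s == x) * p[x]
        else:
            total += (s == x) * p[y] + (s == y) * p[x]
    return total / 2

def pair_likelihood(g1, g2, p, kappa):
    """Likelihood of an unordered genotype pair under IBD coefficients
    kappa = (k0, k1, k2); k2 requires the genotypes to be identical."""
    k0, k1, k2 = kappa
    same = tuple(sorted(g1)) == tuple(sorted(g2))  # bool counts as 0/1
    return hwe(g1, p) * (k0 * hwe(g2, p)
                         + k1 * trans_one_ibd(g1, g2, p)
                         + k2 * same)
```

For two "AA" homozygotes at allele frequency 0.5, the ratio of the parent-offspring likelihood to the unrelated likelihood recovers the classical paternity index 1/p = 2, while an "AA"/"BB" pair is excluded (likelihood 0 under parent-offspring).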


2018 ◽  
pp. 73-78
Author(s):  
Yu. V. Morozov ◽  
M. A. Rajfeld ◽  
A. A. Spektor

The paper proposes a model of a person's seismic signal with noise for investigating the characteristics of passive seismic location systems. Known models based on Gabor and Berlage pulses have been analyzed; these models cannot fully capture the statistical properties of seismic signals. The proposed model is based on the almost cyclic character of seismic signals, the Gaussian character of fluctuations within a pulse, the random amplitude change from pulse to pulse, and the relatively small fluctuation of individual pulse positions. The simulation procedure consists of passing white noise through a linear generating filter with characteristics formed by the real steps of a person, and modulating the resulting pulse sequence by Gaussian functions. The model permits control of the signal-to-noise ratio after its reduction to unity, and variation of the pulse shifts to reflect the irregularity of a person's steps. It has been shown that the model of a person's seismic signal with noise agrees with experimental data.
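The described pipeline can be sketched as below. All parameter values and the one-pole "generating filter" are placeholder choices, not the paper's: white noise is shaped by a linear filter, modulated by Gaussian envelopes at quasi-periodic, jittered step times with random per-pulse amplitudes, and additive noise is rescaled so the signal-to-noise power ratio equals a requested value.

```python
import math
import random

def person_signal(n=2000, step_period=400, jitter=20, width=40,
                  snr=1.0, seed=7):
    """Quasi-cyclic footstep signal: filtered white noise modulated by
    Gaussian envelopes at jittered step times, plus additive noise
    scaled to the requested signal-to-noise (power) ratio.
    Returns (noisy_signal, clean_signal)."""
    rng = random.Random(seed)
    # white noise through a one-pole low-pass 'generating filter'
    carrier, prev = [], 0.0
    for _ in range(n):
        prev = 0.9 * prev + rng.gauss(0, 1)
        carrier.append(prev)
    # Gaussian envelopes at quasi-periodic step positions
    env = [0.0] * n
    t = step_period // 2
    while t < n:
        amp = abs(rng.gauss(1.0, 0.3))            # amplitude change per pulse
        center = t + rng.randint(-jitter, jitter)  # small position fluctuation
        for i in range(max(0, center - 3 * width), min(n, center + 3 * width)):
            env[i] += amp * math.exp(-0.5 * ((i - center) / width) ** 2)
        t += step_period
    clean = [w * e for w, e in zip(carrier, env)]
    p_sig = sum(s * s for s in clean) / n
    noise = [rng.gauss(0, 1) for _ in range(n)]
    p_noise = sum(v * v for v in noise) / n
    scale = math.sqrt(p_sig / (snr * p_noise))  # set noise level for target SNR
    return [s + scale * v for s, v in zip(clean, noise)], clean
```

The final rescaling is what the abstract calls reduction of the signal-to-noise ratio to unity: with `snr=1.0` the signal and noise powers in the output match exactly.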


2021 ◽  
Vol 7 (1) ◽  
Author(s):  
Lukas Vlcek ◽  
Shize Yang ◽  
Yongji Gong ◽  
Pulickel Ajayan ◽  
Wu Zhou ◽  
...  

Abstract
Exploration of structure-property relationships as a function of dopant concentration is commonly based on mean field theories for solid solutions. However, such theories that work well for semiconductors tend to fail in materials with strong correlations, either in electronic behavior or chemical segregation. In these cases, the details of atomic arrangements are generally not explored and analyzed. The knowledge of the generative physics and chemistry of the material can obviate this problem, since defect configuration libraries as stochastic representation of atomic level structures can be generated, or parameters of mesoscopic thermodynamic models can be derived. To obtain such information for improved predictions, we use data from atomically resolved microscopic images that visualize complex structural correlations within the system and translate them into statistical mechanical models of structure formation. Given the significant uncertainties about the microscopic aspects of the material’s processing history along with the limited number of available images, we combine model optimization techniques with the principles of statistical hypothesis testing. We demonstrate the approach on data from a series of atomically resolved scanning transmission electron microscopy images of MoxRe1-xS2 at varying ratios of Mo/Re stoichiometries, for which we propose an effective interaction model that is then used to generate atomic configurations and make testable predictions at a range of concentrations and formation temperatures.
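As a schematic of what generating atomic configurations from an effective interaction model can mean, here is a toy lattice-gas sampler; every parameter is invented for illustration and neither the paper's model nor its fitting procedure is reproduced. It performs Kawasaki-style swaps at fixed composition with Metropolis acceptance under a nearest-neighbour interaction.

```python
import math
import random

def sample_alloy(L=10, x=0.5, J=0.4, T=1.0, sweeps=20, seed=3):
    """Sample a binary lattice (species 0/1, think of a Mo/Re sublattice)
    at fixed composition x by Metropolis-accepted swaps of unlike sites,
    with energy -J per like (1,1) nearest-neighbour pair (periodic b.c.)."""
    rng = random.Random(seed)
    n1 = int(x * L * L)
    sites = [1] * n1 + [0] * (L * L - n1)
    rng.shuffle(sites)
    grid = [sites[i * L:(i + 1) * L] for i in range(L)]

    def e_site(i, j):
        nb = (grid[(i + 1) % L][j] + grid[(i - 1) % L][j]
              + grid[i][(j + 1) % L] + grid[i][(j - 1) % L])
        return -J * grid[i][j] * nb

    def adjacent(i1, j1, i2, j2):
        return ((i1 == i2 and (j1 - j2) % L in (1, L - 1))
                or (j1 == j2 and (i1 - i2) % L in (1, L - 1)))

    for _ in range(sweeps * L * L):
        i1, j1 = rng.randrange(L), rng.randrange(L)
        i2, j2 = rng.randrange(L), rng.randrange(L)
        if grid[i1][j1] == grid[i2][j2] or adjacent(i1, j1, i2, j2):
            continue  # skip no-op and adjacent swaps (keeps local dE exact)
        e_old = e_site(i1, j1) + e_site(i2, j2)
        grid[i1][j1], grid[i2][j2] = grid[i2][j2], grid[i1][j1]
        e_new = e_site(i1, j1) + e_site(i2, j2)
        if e_new > e_old and rng.random() >= math.exp(-(e_new - e_old) / T):
            grid[i1][j1], grid[i2][j2] = grid[i2][j2], grid[i1][j1]  # reject
    return grid
```

Fitting `J` (and richer interaction terms) so that sampled configurations reproduce image-derived statistics is the kind of inverse problem the abstract combines with hypothesis testing.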


Cancers ◽  
2021 ◽  
Vol 13 (5) ◽  
pp. 1153
Author(s):  
Elysia Racanelli ◽  
Abdulhadi Jfri ◽  
Amnah Gefri ◽  
Elizabeth O’Brien ◽  
Ivan Litvinov ◽  
...  

Background: Cutaneous squamous cell carcinoma (cSCC) is a rare complication of hidradenitis suppurativa (HS). Objectives: To conduct a systematic review and an individual patient data (IPD) meta-analysis to describe the clinical characteristics of HS patients developing cSCC and determine predictors of poor outcome. Methods: Medline/PubMed, Embase, and Web of Science were searched for studies reporting cSCC arising in patients with HS from inception to December 2019. A routine descriptive analysis, statistical hypothesis testing, and Kaplan–Meier survival curves/Cox proportional hazards regression models were performed. Results: A total of 34 case reports and series including 138 patients were included in the study. The majority of patients were males (81.6%), White (83.3%), and smokers (n = 22/27 reported) with a mean age of 53.5 years. Most patients had gluteal involvement (87.8%) and Hurley stage 3 HS (88.6%). The mean time from the diagnosis of HS to the development of cSCC was 24.7 years. Human papillomavirus was identified in 12/38 patients tested. Almost 50% of individuals had nodal metastasis and 31.3% had distant metastases. Half of the patients succumbed to their disease. Conclusions: cSCC is a rare but life-threatening complication seen in HS patients, mainly occurring in White males who are smokers with severe, long-standing gluteal HS. Regular clinical examination and biopsy of any suspicious lesions in high-risk patients should be considered. The use of HPV vaccination as a preventive and possibly curative method needs to be explored.
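For readers unfamiliar with the survival machinery cited in the Methods, a minimal Kaplan–Meier estimator looks like this; it is purely illustrative and uses no data from the study.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimates: at each distinct event time t,
    multiply the running survival by (1 - d/n), where d events occur
    among the n subjects still at risk; events[i] is 1 for an event,
    0 for censoring. Returns a list of (time, survival) points."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    s, curve, at_risk = 1.0, [], len(times)
    i = 0
    while i < len(order):
        t = times[order[i]]
        d = c = 0
        while i < len(order) and times[order[i]] == t:  # group ties in time
            if events[order[i]]:
                d += 1
            else:
                c += 1
            i += 1
        if d:
            s *= 1 - d / at_risk
            curve.append((t, s))
        at_risk -= d + c  # both events and censorings leave the risk set
    return curve
```

Censored subjects shrink the risk set without dropping the curve, which is what distinguishes this estimate from a naive fraction surviving.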

