VSP wavefield separation: Wave-by-wave optimization approach

Geophysics, 2007, Vol. 72 (4), pp. T47-T55. Author(s): Emil Blias

Waves propagating across a vertical seismic profiling (VSP) array may be distinguished by their differing arrival times and linear-moveout velocities. Current methods typically assume that the waves propagate uniformly with an unvarying wavelet shape and amplitude. These assumptions break down in the presence of irregular spatial sampling, event truncations, wavelet variations, and noise. I present a new method that allows each event to vary independently in amplitude and arrival time as it propagates across the array. The method uses an iterative global nonlinear optimization scheme that solves several least-squares problems and two eigenvalue problems at each step. Events are stripped from the data one at a time. As stronger events are predicted and removed, weaker events become visible and can be modeled in turn. As each new event is approximately modeled, the fit for all previously removed events is revisited and updated. Iterations continue until no remaining coherent events can be distinguished. Because VSP data sets are typically not large, the expense of this method is not a significant limitation. I demonstrate with a real-data example that this iterative approach can yield significantly better VSP wavefield separation than conventional techniques provide.
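The event-stripping loop can be illustrated with a minimal sketch. It assumes strictly linear moveout, a stacked pilot-trace wavelet, and per-trace least-squares amplitudes; the paper's full nonlinear optimization with its eigenvalue subproblems is not reproduced, and all function and parameter names here are illustrative.

```python
import numpy as np

def strip_strongest_event(data, slownesses):
    """Model and subtract the strongest linear-moveout event (one pass).

    data       : (ntraces, nsamples) VSP common-source gather
    slownesses : candidate moveout slopes, in samples per trace
    np.roll wraps around; production code would zero-pad instead.
    """
    ntr, _ = data.shape
    # 1. Slant-stack scan: find the slope that maximizes stack power.
    best_p, best_power = None, -np.inf
    for p in slownesses:
        shifted = np.array([np.roll(data[i], -int(round(p * i))) for i in range(ntr)])
        power = np.sum(shifted.sum(axis=0) ** 2)
        if power > best_power:
            best_p, best_power = p, power
    aligned = np.array([np.roll(data[i], -int(round(best_p * i))) for i in range(ntr)])
    # 2. Reference wavelet: stack of the aligned traces (assumes a stable shape).
    pilot = aligned.mean(axis=0)
    # 3. Per-trace least-squares amplitude, letting amplitude vary along the array.
    amps = aligned @ pilot / (pilot @ pilot)
    model = np.outer(amps, pilot)
    # 4. Un-align the modeled event and subtract it; repeat on the residual
    #    to strip the next-strongest event, as in the paper's outer loop.
    residual = data - np.array([np.roll(model[i], int(round(best_p * i))) for i in range(ntr)])
    return residual, best_p, amps
```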

Geophysics, 2009, Vol. 74 (4), pp. J35-J48. Author(s): Bernard Giroux, Abderrezak Bouchedda, Michel Chouteau

We introduce two new traveltime picking schemes developed specifically for crosshole ground-penetrating radar (GPR) applications. The main objective is to automate, at least partially, the traveltime picking procedure and to provide first-arrival times whose quality approaches that of manual picking. The first scheme is an adaptation of a method based on crosscorrelation of radar traces collated in gathers according to their associated transmitter-receiver angle. A detector is added to isolate the first cycle of the radar wave and to suppress secondary arrivals that might be mistaken for first arrivals. To improve the accuracy of the arrival times obtained from the crosscorrelation lags, a time-rescaling scheme is implemented to resize the radar wavelets to a common time-window length. The second method is based on the Akaike information criterion (AIC) and the continuous wavelet transform (CWT). It is not tied to the restrictive criterion of waveform similarity that underlies crosscorrelation approaches and that is not guaranteed for traces sorted in common ray-angle gathers. It has the advantage of being fully automated. The performance of the new algorithms is tested on synthetic and real data. In all tests, adding first-cycle isolation to the original crosscorrelation scheme improves the results. In contrast, the time-rescaling approach brings limited benefits, except when strong dispersion is present in the data. In addition, the performance of crosscorrelation picking schemes degrades for data sets with disparate waveforms despite high signal-to-noise ratios. In general, the AIC-CWT approach is more versatile and performs well on all data sets. Only on data with low signal-to-noise ratios is it superseded by the modified crosscorrelation picker.
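The AIC part of the second scheme has a well-known closed form (Maeda's formulation, computed directly from the trace). Below is a minimal sketch of such a picker on a single trace; the CWT preprocessing the authors use to window the trace around the first arrival is omitted, so this is a simplification rather than their full algorithm.

```python
import numpy as np

def aic_pick(trace, dt):
    """First-arrival pick via the Akaike information criterion (Maeda's form).

    AIC(k) = k*log(var(x[:k])) + (N-k-1)*log(var(x[k:]))
    The global minimum marks the transition from noise to signal.
    """
    x = np.asarray(trace, dtype=float)
    n = len(x)
    aic = np.full(n, np.inf)
    for k in range(1, n - 1):
        v1, v2 = np.var(x[:k]), np.var(x[k:])
        if v1 > 0 and v2 > 0:
            aic[k] = k * np.log(v1) + (n - k - 1) * np.log(v2)
    return np.argmin(aic) * dt   # pick time in seconds
```

In practice the minimum is searched only inside a coarse window around an approximate arrival, which is the role the CWT plays in the paper's fully automated version.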


Geophysics, 2013, Vol. 78 (2), pp. D85-D91. Author(s): A. Ahadi, M. A. Riahi

The aim of designing deconvolution operators is to extract the reflectivity series from seismic sections. Because of noise, source-signature inconsistency, reflection/transmission losses, anelastic attenuation, and multiples, the amplitude of a propagating seismic wave varies as a function of time, and the frequency spectra of seismic signals narrow with time. Recognition of reflectors from upgoing waves is one of the notable properties of vertical seismic profiling (VSP) data. Designing a deconvolution operator for VSP data from the downgoing waves is considered one of the most effective deconvolution methods for producing high-resolution images in routine processing of zero-offset VSP data. In this analysis, a Gabor deconvolution operator was designed from the downgoing wavefield, with hyperbolic smoothing used to estimate the wavelet, and was then applied to the upgoing wavefield. The final result is a VSP section with superior resolution. To compare this method with customary deconvolution methods, Wiener deconvolution was applied to the synthetic and real data and the results were compared with those of the Gabor deconvolution.
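As a rough illustration of Gabor-style deconvolution, the sketch below deconvolves a single trace through the short-time Fourier transform. It substitutes a simple 2-D moving average for the paper's hyperbolic smoothing and designs the operator on the trace itself rather than on a separate downgoing wavefield, so it only approximates the workflow described; all parameter values are assumptions.

```python
import numpy as np
from scipy.signal import stft, istft
from scipy.ndimage import uniform_filter

def gabor_decon(trace, fs, nperseg=128, stab=1e-3, smooth=(5, 9)):
    """Gabor-style (time-frequency) deconvolution sketch.

    The time-frequency magnitude spectrum is smoothed to estimate the
    nonstationary wavelet, which is then divided out, whitening the
    trace while preserving the reflectivity detail.
    """
    f, t, Z = stft(trace, fs=fs, nperseg=nperseg)
    mag = np.abs(Z)
    wavelet_mag = uniform_filter(mag, size=smooth)      # smoothed wavelet estimate
    Zd = Z / (wavelet_mag + stab * wavelet_mag.max())   # stabilized spectral division
    _, out = istft(Zd, fs=fs, nperseg=nperseg)
    return out[: len(trace)]
```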


Geophysics, 1995, Vol. 60 (3), pp. 692-701. Author(s): James W. Rector, Spyros K. Lazaratos, Jerry M. Harris, Mark Van Schaack

Using crosswell data collected at a depth of about 3000 ft (900 m) in west Texas carbonates, one of the first well-to-well reflection images of an oil reservoir was produced. The P and S brute stack reflection images created after wavefield separation tied the sonic logs and exhibited a vertical resolution comparable to well-log resolution. Both brute stacks demonstrated continuity of several reflectors known to be continuous from log control and also imaged an angular unconformity that was not detected in log correlations or in surface seismic profiling. The brute stacks, particularly the S-wave reflection image, also exhibited imaging artifacts. We found that multichannel wavefield-separation filters that attenuated interfering wavemodes were a critical component in producing high-resolution reflection images. In this study, the most important elements for an effective wavefield separation were the time alignment of seismic arrivals prior to filter application and the implementation of wavefield-separation filters in multiple domains, particularly in the common-offset domain. The effectiveness of the multichannel filtering was enhanced through the use of extremely fine wellbore sampling intervals. In this study, 2.5 ft (0.76 m) vertical sampling intervals for both source and receiver were used, whereas most previous crosswell data sets were collected with much coarser sampling intervals, resulting in spatial aliasing and limiting the utility of the data for reflection processing. The wavefield separation techniques employed in this study used data volumes and associated filtering operations that were several orders of magnitude larger than those encountered in conventional VSP data analysis.
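Time alignment followed by cross-trace filtering is a standard wavefield-separation recipe. The sketch below applies a median filter to first-break-aligned traces; it is a generic single-domain pass, not the paper's multi-domain filter design, and the function and parameter names are illustrative.

```python
import numpy as np

def separate_aligned_mode(gather, picks, dt, filt_len=11):
    """Extract one wavemode by time alignment plus cross-trace median filtering.

    gather : (ntraces, nsamples) crosswell or VSP panel
    picks  : arrival times (s) of the mode to extract, one per trace
    Aligning on the picks flattens the target mode, so a median filter
    across traces passes it and rejects interfering, unflattened events.
    (The paper applies such filters in several domains, e.g. the
    common-offset domain; only one pass is shown here.)
    """
    ntr, _ = gather.shape
    shifts = np.round(np.asarray(picks) / dt).astype(int)
    aligned = np.array([np.roll(gather[i], -shifts[i]) for i in range(ntr)])
    half = filt_len // 2
    padded = np.pad(aligned, ((half, half), (0, 0)), mode="edge")
    mode = np.array([np.median(padded[i:i + filt_len], axis=0) for i in range(ntr)])
    # Undo the alignment: 'mode' is the separated wavefield, the rest is residual.
    mode = np.array([np.roll(mode[i], shifts[i]) for i in range(ntr)])
    return mode, gather - mode
```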


Geophysics, 1995, Vol. 60 (4), pp. 968-977. Author(s): Rune Mittet, Ketil Hokstad

Marine walk-away vertical seismic profiling (VSP) data can be transformed into reverse VSP (RVSP) data using an elastic reciprocity transformation. A reciprocity transform is derived and tested on data generated with a 2-D high-order, finite-difference modeling scheme in a complex elastic model. First, 201 shots are generated in a walk-away VSP experimental configuration. Both the x-component and the z-component of the displacement are measured. These data are collected into two common-receiver data sets. Then two shots are generated in a reverse VSP configuration. We demonstrate that subtraction of the reverse VSP data from the walk-away VSP data gives very small residuals. The transformation of walk-away data into reverse VSP data makes prestack shot-domain migration feasible for walk-away data. Synthetic data from a multishot walk-away experiment can be obtained from one or a few modeling operations with an RVSP experimental configuration. The required computer time is reduced by two orders of magnitude.
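Index-wise, the transform amounts to re-sorting walk-away common-receiver gathers into RVSP shot gathers. The sketch below shows only that reorganization and the cost argument; it omits the elastic component weighting that the actual reciprocity transform applies, and the array shapes are assumptions matching the abstract's geometry.

```python
import numpy as np

# Walk-away VSP: many surface shots recorded by a few borehole receivers.
# d_vsp[s, r, t] = trace from surface source s at borehole receiver r.
nsrc, nrcv, nt = 201, 2, 1000
d_vsp = np.random.randn(nsrc, nrcv, nt)   # stand-in for modeled/recorded data

# Reciprocity maps each common-receiver gather onto an RVSP shot gather:
# the borehole receiver becomes the source, the surface sources become
# receivers.  (The elastic transform also weights the displacement
# components; that scaling is omitted in this index-only sketch.)
d_rvsp = np.transpose(d_vsp, (1, 0, 2))   # shape (nrcv, nsrc, nt)

# Cost argument from the abstract: modeling 201 walk-away shots is replaced
# by modeling nrcv = 2 RVSP shots, roughly a hundredfold saving.
print(d_rvsp.shape)                       # (2, 201, 1000)
```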


Geophysics, 2009, Vol. 74 (6), pp. WCB71-WCB79. Author(s): Stephan Husen, Tobias Diehl, Edi Kissling

Despite the increase in quality and number of seismic stations in many parts of the world, the accurate timing of individual arrivals remains crucial for many tomographic applications. To achieve a data set of high quality, arrival times need to be picked with high accuracy, including a proper assessment of timing uncertainty and phase identification, and with a high level of consistency. We have investigated the effects of data quantity and quality on the solution quality in local earthquake tomography. We compared tomographic results obtained with synthetic and real data for two very different data sets. The first data set consisted of a large set of arrival times of low precision and unknown accuracy taken from the International Seismological Centre (ISC) Bulletin for the greater Alpine region. The second, high-quality data set for the same region was seven times smaller and was obtained by automated quality-weighted repicking. In a first series of inversions, synthetic data resembling the two data sets were inverted with the same amount of Gaussian-distributed noise added. In a second series of inversions, the noise level was increased successively for the ISC data to study the effect of larger Gaussian-distributed errors on the solution quality. Finally, the real data for both data sets were inverted. These investigations showed that, for Gaussian-distributed errors, a smaller data set of high quality can achieve solution quality similar to or better than that of a data set seven times larger but about four times lower in quality. Our results further suggest that the quality of the ISC Bulletin is degraded significantly by inconsistencies, strongly limiting the use of this large data set for local earthquake tomography studies.
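A back-of-envelope illustration (ours, not the paper's) of why the smaller high-quality set can compete under Gaussian errors: least-squares uncertainty scales as sigma/sqrt(N), so seven times more data at four times the noise gives a factor 4/sqrt(7), about 1.5, more error. The Monte Carlo sketch below checks that ratio on a trivial averaging problem; all numbers are assumptions chosen to mirror the 7x/4x figures in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

true_value = 1.0
n_small, sigma_small = 1_000, 0.1     # small, high-quality data set
n_large, sigma_large = 7_000, 0.4     # 7x larger, ~4x noisier

trials = 2_000
err_small = [abs(np.mean(true_value + sigma_small * rng.standard_normal(n_small)) - true_value)
             for _ in range(trials)]
err_large = [abs(np.mean(true_value + sigma_large * rng.standard_normal(n_large)) - true_value)
             for _ in range(trials)]
# Expect err_large / err_small close to 4 / sqrt(7) ~ 1.51.
print(np.mean(err_small), np.mean(err_large))
```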


2021. Author(s): Jakob Raymaekers, Peter J. Rousseeuw

Many real data sets contain numerical features (variables) whose distribution is far from normal (Gaussian). Instead, their distribution is often skewed. In order to handle such data it is customary to preprocess the variables to make them more normal. The Box–Cox and Yeo–Johnson transformations are well-known tools for this. However, the standard maximum likelihood estimator of their transformation parameter is highly sensitive to outliers, and will often try to move outliers inward at the expense of the normality of the central part of the data. We propose a modification of these transformations as well as an estimator of the transformation parameter that is robust to outliers, so the transformed data can be approximately normal in the center and a few outliers may deviate from it. It compares favorably to existing techniques in an extensive simulation study and on real data.
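A minimal sketch of one robust way to choose the Yeo–Johnson parameter: grid search minimizing a trimmed quantile-quantile misfit after median/MAD standardization, so a few outliers cannot drive the fit. This is an illustrative stand-in, not the authors' estimator; the grid and trimming fraction are assumed tuning constants.

```python
import numpy as np
from scipy.stats import yeojohnson, norm

def robust_yj_lambda(x, lambdas=np.linspace(-2, 2, 81), trim=0.1):
    """Robust grid-search choice of the Yeo-Johnson lambda (illustrative).

    For each candidate lambda, the data are transformed, centered and
    scaled with median/MAD, and compared with normal quantiles after
    trimming the most extreme points on each side.
    """
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    k = int(trim * n)
    targets = norm.ppf((np.arange(1, n + 1) - 0.5) / n)
    best_lam, best_loss = None, np.inf
    for lam in lambdas:
        y = np.sort(yeojohnson(x, lmbda=lam))
        mad = np.median(np.abs(y - np.median(y)))
        if mad == 0:
            continue
        z = (y - np.median(y)) / (1.4826 * mad)       # robust standardization
        loss = np.mean((z - targets)[k:n - k] ** 2)   # trimmed QQ misfit
        if loss < best_loss:
            best_lam, best_loss = lam, loss
    return best_lam
```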


Entropy, 2020, Vol. 23 (1), pp. 62. Author(s): Zhengwei Liu, Fukang Zhu

Thinning operators play an important role in the analysis of integer-valued autoregressive models, and the most widely used is binomial thinning. Inspired by the theory of extended Pascal triangles, a new thinning operator, named extended binomial thinning, is introduced as a generalization of binomial thinning. Compared to the binomial thinning operator, the extended binomial thinning operator has two parameters and is more flexible in modeling. Based on the proposed operator, a new integer-valued autoregressive model is introduced that can accurately and flexibly capture the dispersion features of count time series. Two-step conditional least squares (CLS) estimation is investigated for the innovation-free case, and conditional maximum likelihood estimation is also discussed. We also obtain the asymptotic properties of the two-step CLS estimator. Finally, three overdispersed or underdispersed real data sets are considered to illustrate the superior performance of the proposed model.
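For context, standard binomial thinning and the resulting INAR(1) recursion with CLS estimation can be sketched as follows. The paper's extended binomial operator adds a second parameter whose exact form is not reproduced here; the sketch shows only the baseline it generalizes.

```python
import numpy as np

rng = np.random.default_rng(1)

def binomial_thinning(x, alpha):
    """alpha o x: each of the x counts survives independently with prob alpha."""
    return rng.binomial(x, alpha)

def simulate_inar1(n, alpha, lam, x0=0):
    """INAR(1): X_t = alpha o X_{t-1} + eps_t, with Poisson(lam) innovations."""
    x = np.empty(n, dtype=int)
    x[0] = x0
    for t in range(1, n):
        x[t] = binomial_thinning(x[t - 1], alpha) + rng.poisson(lam)
    return x

def cls_estimate(x):
    """Conditional least squares for (alpha, lam):
    E[X_t | X_{t-1}] = alpha * X_{t-1} + lam, i.e. a linear regression."""
    A = np.column_stack([x[:-1], np.ones(len(x) - 1)])
    alpha_hat, lam_hat = np.linalg.lstsq(A, x[1:], rcond=None)[0]
    return alpha_hat, lam_hat

x = simulate_inar1(5_000, alpha=0.5, lam=2.0)
print(cls_estimate(x))   # close to (0.5, 2.0)
```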


Econometrics, 2021, Vol. 9 (1), pp. 10. Author(s): Šárka Hudecová, Marie Hušková, Simos G. Meintanis

This article considers goodness-of-fit tests for bivariate INAR and bivariate Poisson autoregression models. The test statistics are based on an L2-type distance between two estimators of the probability generating function of the observations: one entirely nonparametric and the other semiparametric, computed under the corresponding null hypothesis. The asymptotic distribution of the proposed test statistics is derived both under the null hypothesis and under alternatives, and consistency is proved. The case of testing bivariate generalized Poisson autoregression and the extension of the methods to dimensions higher than two are also discussed. The finite-sample performance of a parametric bootstrap version of the tests is illustrated via a series of Monte Carlo experiments. The article concludes with applications to real data sets and discussion.
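The PGF-distance idea can be sketched in a univariate Poisson setting, a deliberate simplification of the bivariate models treated in the article: compare the empirical PGF with the fitted model PGF in L2 and calibrate the test with a parametric bootstrap, as the article does for its tests.

```python
import numpy as np

rng = np.random.default_rng(2)

def pgf_stat(x, grid=np.linspace(0.0, 1.0, 101)):
    """L2 distance between the empirical PGF and the fitted Poisson PGF.

    Empirical PGF: g_hat(u) = mean(u**X).  Under H0 (Poisson), the
    semiparametric estimator is exp(lam*(u-1)) with lam fitted from data.
    """
    n = len(x)
    g_emp = np.array([np.mean(u ** x) for u in grid])
    lam = x.mean()
    g_fit = np.exp(lam * (grid - 1.0))
    du = grid[1] - grid[0]
    return n * np.sum((g_emp - g_fit) ** 2) * du   # Riemann approximation

def bootstrap_pvalue(x, n_boot=500):
    """Parametric bootstrap calibration of the PGF-distance statistic."""
    t_obs, lam = pgf_stat(x), x.mean()
    t_boot = [pgf_stat(rng.poisson(lam, size=len(x))) for _ in range(n_boot)]
    return np.mean(np.array(t_boot) >= t_obs)

x = rng.poisson(3.0, size=200)
print(bootstrap_pvalue(x))   # large p-value: H0 (Poisson) not rejected
```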


Information, 2021, Vol. 12 (5), pp. 202. Author(s): Louai Alarabi, Saleh Basalamah, Abdeltawab Hendawi, Mohammed Abdalla

The rapid spread of infectious diseases is a major public health problem, and recent developments in fighting these diseases have heightened the need for contact tracing. Contact tracing can be considered an ideal method for controlling the transmission of infectious diseases: it leads to diagnostic testing, treatment or self-isolation of suspected cases, and treatment of infected persons, which ultimately limits the spread of disease. This paper proposes a technique named TraceAll that traces all contacts exposed to an infected patient and produces a list of these contacts as potentially infected patients. Initially, it considers the infected patient as the querying user and begins fetching the contacts exposed to him. Second, it obtains all the trajectories of objects that moved near the querying user. Next, it examines these trajectories, considering social distance and exposure period, to determine whether these objects have become infected. Experimental evaluation of the proposed technique on real data sets illustrates the effectiveness of this solution. Comparative experiments confirm that TraceAll outperforms baseline methods by 40% in the efficiency of answering contact tracing queries.
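A minimal sketch of the trajectory-screening step, assuming time-aligned trajectories and simple thresholds for social distance and exposure duration; TraceAll's actual indexing and query pipeline is more elaborate, and all names and thresholds here are illustrative.

```python
import numpy as np

def trace_contacts(query_traj, other_trajs, social_dist=2.0, exposure_steps=5):
    """Flag trajectories exposed to the querying (infected) user.

    query_traj  : (T, 2) array of the infected user's positions per time step
    other_trajs : dict, object_id -> (T, 2) array, time-aligned with the query
    An object is flagged when it stays within `social_dist` of the querying
    user for at least `exposure_steps` consecutive time steps.
    """
    exposed = []
    for oid, traj in other_trajs.items():
        close = np.linalg.norm(traj - query_traj, axis=1) <= social_dist
        run = 0
        for c in close:
            run = run + 1 if c else 0
            if run >= exposure_steps:       # sustained exposure detected
                exposed.append(oid)
                break
    return exposed
```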


Symmetry, 2021, Vol. 13 (3), pp. 474. Author(s): Abdulhakim A. Al-Babtain, Ibrahim Elbatal, Hazem Al-Mofleh, Ahmed M. Gemeay, Ahmed Z. Afify, ...

In this paper, we introduce a new flexible generator of continuous distributions called the transmuted Burr X-G (TBX-G) family, which extends and increases the flexibility of the Burr X generator. The general statistical properties of the TBX-G family are derived. One special sub-model, the TBX-exponential distribution, is studied in detail. We discuss eight approaches to estimating the TBX-exponential parameters, and numerical simulations are conducted to compare the approaches based on partial and overall ranks. Based on our study, the Anderson–Darling estimators are recommended for estimating the TBX-exponential parameters. Using two skewed real data sets from the engineering sciences, we illustrate the importance and flexibility of the TBX-exponential model compared with existing competing distributions.
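Anderson–Darling estimation is minimum-distance fitting of the AD statistic over the parameter space. The sketch below applies it to a plain exponential as a stand-in, since the TBX-exponential CDF is not given in this abstract; swapping in that CDF would reproduce the recommended estimator.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import expon

def ad_statistic(x, cdf):
    """Anderson-Darling statistic A^2 for a fully specified CDF."""
    z = np.clip(np.sort(cdf(x)), 1e-12, 1 - 1e-12)   # guard the logs
    n = len(z)
    i = np.arange(1, n + 1)
    return -n - np.sum((2 * i - 1) * (np.log(z) + np.log(1 - z[::-1]))) / n

def ad_estimate(x):
    """Minimum-AD fit of an exponential scale parameter (illustrative)."""
    obj = lambda s: ad_statistic(x, lambda v: expon.cdf(v, scale=s))
    return minimize_scalar(obj, bounds=(1e-6, 10 * np.mean(x)), method="bounded").x

x = np.random.default_rng(3).exponential(scale=2.0, size=300)
print(ad_estimate(x))   # close to 2.0
```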

