Five-dimensional interpolation: Recovering from acquisition constraints

Geophysics ◽  
2009 ◽  
Vol 74 (6) ◽  
pp. V123-V132 ◽  
Author(s):  
Daniel Trad

Although 3D seismic data are being acquired in larger volumes than ever before, the spatial sampling of these volumes is not always adequate for certain seismic processes. This is especially true of marine and land wide-azimuth acquisitions, a limitation that has driven the development of multidimensional data-interpolation techniques. Simultaneous interpolation in all five seismic data dimensions (inline, crossline, offset, azimuth, and frequency) has great utility in predicting missing data with correct amplitude and phase variations. Although many techniques can be implemented in five dimensions, this study focused on sparse Fourier reconstruction. The success of Fourier interpolation methods depends largely on two factors: (1) having efficient Fourier transform operators that permit the use of large multidimensional data windows and (2) constraining the spatial spectrum along dimensions where seismic amplitudes change slowly so that the sparseness and band-limitation assumptions remain valid. Fourier reconstruction can be performed by enforcing a sparseness constraint on the 4D spatial spectrum obtained from frequency slices of five-dimensional windows. Binning spatial positions onto a fine 4D grid permits the use of the FFT, which aids the convergence of the inversion algorithm and improves both the results and the computational efficiency. The 5D interpolation can successfully interpolate sparse data, improve AVO analysis, and reduce migration artifacts. Target geometries for optimal interpolation and regularization of land data can be classified in terms of whether they preserve the original data and whether they are designed to achieve surface or subsurface consistency.
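The core idea of enforcing sparseness on the spatial spectrum while honoring the recorded traces can be sketched with an iterative-thresholding (POCS-style) loop. This is an illustrative stand-in, not the paper's implementation: `pocs_interpolate` and its parameters are hypothetical names, and a 2D grid stands in for the binned 4D spatial grid of one frequency slice.

```python
import numpy as np

def pocs_interpolate(data, mask, n_iter=50, p_max=0.99, p_min=0.01):
    """Fill missing samples (mask == 0) of a binned frequency slice by
    iteratively thresholding its spatial spectrum (a POCS-style sketch).

    data : complex ndarray, one frequency slice binned onto a regular grid
    mask : same-shape 0/1 array, 1 where an original trace exists
    """
    model = data.copy()
    for it in range(n_iter):
        spec = np.fft.fftn(model)
        # keep only the strongest coefficients; relax the threshold slowly
        frac = p_max + (p_min - p_max) * it / (n_iter - 1)
        thresh = frac * np.abs(spec).max()
        spec[np.abs(spec) < thresh] = 0.0
        model = np.fft.ifftn(spec)
        # re-insert the original (known) samples at every iteration
        model = mask * data + (1 - mask) * model
    return model
```

Because binning puts every trace on a regular grid, each iteration costs only one forward and one inverse FFT, which is what makes large multidimensional windows affordable.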

Geophysics ◽  
2010 ◽  
Vol 75 (6) ◽  
pp. WB103-WB111 ◽  
Author(s):  
Side Jin

Regularizing inadequately and irregularly sampled seismic data is one of the important problems in seismic data processing. An improvement to existing methods is proposed: a 5D regularization/interpolation scheme based on a damped least-norm Fourier inversion. Under the assumption of planar seismic events within small data windows, the spatial spectrum of the regularized data at a fixed frequency should be sparse and have minimum damped norm. The inversion scheme consists of finding a set of regularly spaced spatial Fourier coefficients by minimizing their damped norm for each frequency, subject to the condition that the resulting coefficients also faithfully reconstruct the original data. The damping factors are derived automatically from the amplitude spectra of the regularized low-frequency data. Guided by these damping factors, and with wavenumber ranges adjusted automatically according to the Nyquist sampling theorem, the inversion algorithm naturally yields a one-step solution that both stabilizes and antialiases the interpolation. A distinctive feature of the method is its use of high-dimensional nonuniform fast Fourier transforms to evaluate the expensive discrete Fourier transforms required by the conjugate-gradient iterations, which improves computational efficiency. Results on synthetic and field data demonstrate that the algorithm performs well on highly irregular data and outperforms lower-dimensional interpolation schemes.
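A minimal sketch of the damped least-norm idea for one frequency slice, in 1D for clarity: the paper evaluates the transforms with nonuniform FFTs inside conjugate-gradient iterations, whereas this toy version builds the dense DFT matrix and solves the damped normal equations directly. The function name and the scalar damping `eps` are hypothetical (the paper derives frequency-dependent damping from the low-frequency spectrum).

```python
import numpy as np

def damped_fourier_regularize(d, x, k, eps=1e-3):
    """Minimum damped-norm Fourier coefficients fitting irregular samples.

    d   : data values at irregular positions x (one frequency slice)
    x   : irregular spatial coordinates
    k   : wavenumbers of the regular output grid
    eps : damping factor (a single scalar in this sketch)
    """
    # A maps Fourier coefficients m to data at the irregular positions
    A = np.exp(2j * np.pi * np.outer(x, k))
    # damped least-norm solution: m = A^H (A A^H + eps I)^-1 d
    G = A @ A.conj().T + eps * np.eye(len(d))
    return A.conj().T @ np.linalg.solve(G, d)
```

Once the coefficients are found, evaluating the same Fourier sum at regular positions yields the regularized traces; the dense solve here is what the nonuniform FFT machinery replaces at scale.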


2021 ◽  
Vol 11 (11) ◽  
pp. 4874
Author(s):  
Milan Brankovic ◽  
Eduardo Gildin ◽  
Richard L. Gibson ◽  
Mark E. Everett

Seismic data provide essential information in geophysical exploration, both for locating hydrocarbon-rich areas and for fracture monitoring during well stimulation. Because of its high-frequency acquisition rate and dense spatial sampling, distributed acoustic sensing (DAS) has seen increasing application in microseismic monitoring. Given the large volumes of data to be analyzed in real time and the impractical memory and storage requirements this implies, fast compression and accurate interpretation methods are necessary for real-time monitoring campaigns using DAS. In response to these developments in data acquisition, we have created shifted-matrix decomposition (SMD), which compresses seismic data into pairs of singular vectors coupled with shift vectors. This is achieved by shifting the columns of a matrix of seismic data before applying singular value decomposition (SVD) to extract a pair of singular vectors. SMD serves both compression and denoising, because reconstructing seismic data from its compressed form yields a denoised version of the original data. The data can also be analyzed in compressed form, enabling signal detection and velocity estimation. The developed algorithm can therefore simultaneously compress and denoise seismic data while estimating signal presence and wave velocities from the compressed representation. To show its efficiency, we compare SMD to local SVD and structure-oriented SVD, similar SVD-based methods used only for denoising seismic data. While the development of SMD was motivated by the increasing use of DAS, SMD can be applied to any seismic data obtained from a large number of receivers. For example, here we present initial applications of SMD to readily available marine seismic data.
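One SMD-style step, shift then rank-1 SVD, can be sketched as follows. This is a simplified illustration under stated assumptions: shifts are estimated by cross-correlation against the first trace (the paper's shift-estimation details may differ), and `smd_rank1` is a hypothetical name.

```python
import numpy as np

def smd_rank1(D, max_shift=50):
    """Align the columns (traces) of a gather by cross-correlation with a
    reference trace, then take a rank-1 SVD of the aligned matrix.

    D : (n_samples, n_traces) array of seismic traces
    Returns the singular pair, the shift vector, and the denoised
    reconstruction mapped back to the original (unshifted) geometry.
    """
    nt, ntr = D.shape
    ref = D[:, 0]
    shifts = np.zeros(ntr, dtype=int)
    aligned = np.empty_like(D)
    for j in range(ntr):
        # pick the circular shift that best correlates trace j with ref
        corr = [np.dot(np.roll(D[:, j], -s), ref)
                for s in range(-max_shift, max_shift + 1)]
        shifts[j] = int(np.argmax(corr)) - max_shift
        aligned[:, j] = np.roll(D[:, j], -shifts[j])
    # after alignment a single coherent event is (nearly) rank 1
    U, s, Vt = np.linalg.svd(aligned, full_matrices=False)
    recon_aligned = s[0] * np.outer(U[:, 0], Vt[0])
    # undo the shifts to return to the original geometry
    recon = np.empty_like(D)
    for j in range(ntr):
        recon[:, j] = np.roll(recon_aligned[:, j], shifts[j])
    return U[:, 0], s[0], Vt[0], shifts, recon
```

The compressed form here is one singular pair plus one integer shift per trace, which is why reconstruction from it acts as a denoiser: everything outside the rank-1 aligned subspace is discarded.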


2020 ◽  
Vol 91 (4) ◽  
pp. 2127-2140 ◽  
Author(s):  
Glenn Thompson ◽  
John A. Power ◽  
Jochen Braunmiller ◽  
Andrew B. Lockhart ◽  
Lloyd Lynch ◽  
...  

An eruption of the Soufrière Hills Volcano (SHV) on the eastern Caribbean island of Montserrat began on 18 July 1995 and continued until February 2010. Within nine days of the eruption onset, an existing four-station analog seismic network (ASN) was expanded to 10 sites. Telemetered data from this network were recorded, processed, and archived locally using a system developed by scientists from the U.S. Geological Survey (USGS) Volcano Disaster Assistance Program (VDAP). In October 1996, a digital seismic network (DSN) was deployed with the ability to capture larger amplitude signals across a broader frequency range. These two networks operated in parallel until December 2004, with separate telemetry and acquisition systems (analysis systems were merged in March 2001). Although the DSN provided better quality data for research, the ASN featured superior real-time monitoring tools and captured valuable data, including the only seismic data from the first 15 months of the eruption. These successes of the ASN have been largely overlooked. This article documents the evolution of the ASN, the VDAP system, the original data captured, and the recovery and conversion of more than 230,000 seismic events from legacy SUDS, Hypo71, and Seislog formats into a Seisan database with waveform data in miniSEED format. No digital catalog existed for these events, but students at the University of South Florida have classified two-thirds of the 40,000 events that were captured between July 1995 and October 1996. Locations and magnitudes were recovered for ∼10,000 of these events. Real-time seismic amplitude measurement, seismic spectral amplitude measurement, and tiltmeter data were also captured. The result is that the ASN seismic dataset is now more discoverable, accessible, and reusable, in accordance with FAIR data principles. These efforts could catalyze new research on the 1995–2010 SHV eruption.
Furthermore, many observatories have data in these same legacy data formats and might benefit from procedures and codes documented here.


Geophysics ◽  
2021 ◽  
pp. 1-97
Author(s):  
Dawei Liu ◽  
Lei Gao ◽  
Xiaokai Wang ◽  
Wenchao Chen

Acquisition footprint causes serious interference with seismic attribute analysis, which severely hinders accurate reservoir characterization; footprint suppression has therefore become increasingly important in industry and academia. In this work, we assume that a time slice of 3D post-stack migrated seismic data comprises two main components: useful signals and acquisition footprint. Useful signals describe the spatial distribution of geological structures and have locally piecewise-smooth morphological features, whereas acquisition footprint typically appears as periodic artifacts in the time-slice domain; in marine acquisition, in particular, its local morphological features appear as stripes. Because the two components have different morphological features, we can train an adaptive dictionary and divide its atoms into two sub-dictionaries to reconstruct the components separately. We propose an adaptive dictionary-learning method for acquisition-footprint suppression in time slices of 3D post-stack migrated seismic data. To obtain the adaptive dictionary, we use the K-singular value decomposition (K-SVD) algorithm to sparsely represent patches of the time slice. Each atom of the trained dictionary represents certain local morphological features of the time slice. According to the difference in variation level between the horizontal and vertical directions, the atoms are divided into two types: one mainly represents the local morphological features of the acquisition footprint, and the other represents those of the useful signals. The two components are then reconstructed from the corresponding sub-dictionaries using morphological component analysis.
Synthetic and field data examples indicate that the proposed method effectively suppresses the acquisition footprint while remaining faithful to the original data.
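The atom-classification step, splitting a trained dictionary by directional variation, can be sketched as below. This is a plausible reading of the criterion, not the paper's exact rule: the function name, the ratio threshold, and the use of total absolute variation are assumptions, and the dictionary itself would come from K-SVD.

```python
import numpy as np

def split_atoms_by_variation(dictionary, patch_shape, ratio=2.0):
    """Split dictionary atoms into 'footprint' and 'signal' sub-dictionaries
    by comparing variation along the two patch axes.

    dictionary  : (patch_size, n_atoms) array, one flattened atom per column
    patch_shape : (rows, cols) of each patch
    """
    footprint, signal = [], []
    for j in range(dictionary.shape[1]):
        atom = dictionary[:, j].reshape(patch_shape)
        # total absolute variation along the vertical and horizontal axes
        v_var = np.abs(np.diff(atom, axis=0)).sum()
        h_var = np.abs(np.diff(atom, axis=1)).sum()
        # stripe-like footprint atoms vary strongly in one direction only
        if v_var > ratio * h_var or h_var > ratio * v_var:
            footprint.append(j)
        else:
            signal.append(j)
    return footprint, signal
```

Morphological component analysis then reconstructs each component using only its own sub-dictionary, so stripe-like energy is routed to the footprint estimate.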


2019 ◽  
Vol 24 (2) ◽  
pp. 201-214
Author(s):  
Rashed Poormirzaee ◽  
Siamak Sarmady ◽  
Yusuf Sharghi

Like any other geophysical method, the seismic refraction method faces non-uniqueness in the estimation of model parameters. Recently, various nonlinear processing techniques have been introduced, particularly for seismic inversion. One recently developed metaheuristic is the bat optimization algorithm (BA). Standard BA is usually quick at exploiting solutions, but its exploration ability is relatively poor. To improve the exploration ability of BA, the current study introduces a hybrid metaheuristic, the mutation-based bat algorithm (MBA), which incorporates a mutation operator into BA, and applies it to the inversion of seismic refraction data. The efficiency and stability of the proposed inversion algorithm were tested on different synthetic cases. Finally, the MBA inversion algorithm was applied to a real dataset acquired at the Leylanchay dam site in East Azerbaijan province, Iran, to determine alluvium depth. The performance of MBA on both synthetic and real datasets was compared with standard BA. The dataset was also processed with a tomographic approach, and those results were compared with the results of the proposed MBA inversion. In general, the MBA inversion results were superior to those of standard BA and agreed well with available borehole data and geological sections at the dam site. The analysis of the seismic data showed that the studied site comprises three distinct layers: saturated alluvium, unsaturated alluvium, and dolomite bedrock. The measured seismic velocity across the dam site ranges from 400 to 3,500 m/s, with alluvium thickness ranging from 5 to 19 m. The findings show that the proposed metaheuristic inversion framework is a simple, fast, and powerful tool for seismic data processing.
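The BA-plus-mutation idea can be sketched with a minimal optimizer. Everything here is an illustrative assumption rather than the paper's algorithm: the name `mba_minimize`, the fixed loudness and pulse rate, the shrinking local-search step, and the simple misfit function standing in for the refraction traveltime objective.

```python
import numpy as np

def mba_minimize(misfit, bounds, n_bats=20, n_iter=200, p_mut=0.2, seed=0):
    """Minimal mutation-based bat algorithm (MBA) sketch: a standard bat
    algorithm plus a random-reset mutation operator to aid exploration."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    dim = lo.size
    x = rng.uniform(lo, hi, (n_bats, dim))       # bat positions (models)
    v = np.zeros((n_bats, dim))                  # bat velocities
    fit = np.array([misfit(b) for b in x])
    i0 = int(fit.argmin())
    best, best_fit = x[i0].copy(), float(fit[i0])
    loudness, pulse_rate = 0.9, 0.5
    for it in range(n_iter):
        sigma = 0.5 * 0.98 ** it                 # shrinking local-walk step
        for i in range(n_bats):
            freq = rng.uniform(0.0, 2.0)         # random pulse frequency
            v[i] += (x[i] - best) * freq
            cand = np.clip(x[i] + v[i], lo, hi)
            if rng.random() > pulse_rate:
                # local random walk around the current best solution
                cand = np.clip(best + sigma * loudness *
                               rng.standard_normal(dim), lo, hi)
            if rng.random() < p_mut:
                # mutation operator: reset one random coordinate, which is
                # the exploration boost MBA adds on top of standard BA
                j = rng.integers(dim)
                cand[j] = rng.uniform(lo[j], hi[j])
            f_cand = misfit(cand)
            if f_cand < fit[i] and rng.random() < loudness:
                x[i], fit[i] = cand.copy(), f_cand
            if f_cand < best_fit:
                best, best_fit = cand.copy(), float(f_cand)
    return best, best_fit
```

For refraction inversion, the model vector would hold layer velocities and thicknesses, and `misfit` would measure the mismatch between observed and forward-modeled first-arrival traveltimes.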


Geophysics ◽  
1994 ◽  
Vol 59 (11) ◽  
pp. 1763-1773 ◽  
Author(s):  
Hans J. Tieman

Reflection seismic data contain a long-wavelength ambiguity that makes it difficult to separate traveltime information into velocity and reflector-depth components. The existence of this velocity-depth ambiguity is a feature of the geometry of the subsurface and is not caused by the particular inversion algorithm being used. Factors that control the occurrence of velocity-depth ambiguities include the effective width of a potential velocity anomaly (i.e., its spatial wavelength), its height above a reflector, and its thickness. Factors that do not affect velocity-depth ambiguities are the magnitude of the anomaly (the difference in velocity between it and the background) and the cable length with which the data were recorded. A thin velocity anomaly induces an ambiguity at a wavelength approximately equal to 4.44 times the height of the anomaly above the reflector. A thick anomaly that spans the entire space from surface to reflector induces an ambiguity at a wavelength approximately equal to 2.57 times the depth to the reflector. These wavelengths are significant in size and therefore of exploration interest. Through Fourier analysis, any subsurface velocity field can be decomposed into spatial-frequency components; thus the wavelength-dependent velocity-depth ambiguity adversely affects all velocity distributions.
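The two quoted rules of thumb are simple proportionalities, restated here as throwaway helpers (the function names are made up for illustration):

```python
def thin_anomaly_wavelength(height_above_reflector):
    """Ambiguity wavelength (~4.44 x height) for a thin velocity anomaly
    at a given height above the reflector."""
    return 4.44 * height_above_reflector

def thick_anomaly_wavelength(reflector_depth):
    """Ambiguity wavelength (~2.57 x depth) for an anomaly spanning the
    whole column from surface to reflector."""
    return 2.57 * reflector_depth
```

For example, a thin anomaly 1,000 m above a reflector is ambiguous at a spatial wavelength of about 4,440 m, well within the scale of exploration targets.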


Geophysics ◽  
2015 ◽  
Vol 80 (1) ◽  
pp. R31-R41 ◽  
Author(s):  
Andrea Zunino ◽  
Klaus Mosegaard ◽  
Katrine Lange ◽  
Yulia Melnikova ◽  
Thomas Mejer Hansen

Determination of a petroleum reservoir's structure and rock bulk properties relies extensively on inference from reflection seismology. However, classic deterministic methods for inverting seismic data for reservoir properties suffer from some limitations, among them the difficulty of handling complex, possibly nonlinear forward models and the lack of robust uncertainty estimates. To overcome these limitations, we studied a methodology for inverting seismic reflection data in the framework of the probabilistic approach to inverse problems, using a Markov chain Monte Carlo (McMC) algorithm with the goal of directly inferring the rock facies and porosity of a target reservoir zone. We thus combined a rock-physics model with seismic data in a single inversion algorithm. For large data sets, the McMC method may become computationally impractical, so we relied on multiple-point-based a priori information to quantify geologically plausible models. We tested this methodology on a synthetic reservoir model. The solution of the inverse problem is then represented by a collection of facies and porosity reservoir models, which are samples of the posterior distribution. The final product includes probability maps of the reservoir properties obtained by performing statistical analysis on the collection of solutions.
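The sampling idea can be sketched with a minimal Metropolis chain over a layered porosity model. This is a bare-bones stand-in for the paper's workflow: the name `metropolis_porosity` is hypothetical, the prior here is a plain uniform on [0, 0.4] rather than a multiple-point prior, facies are omitted, and `forward` stands in for the rock-physics plus seismic forward model.

```python
import numpy as np

def metropolis_porosity(d_obs, forward, n_layers, n_samples=4000,
                        step=0.02, sigma=0.05, seed=0):
    """Metropolis sampler for layer porosities given observed data d_obs,
    a forward model, and Gaussian data noise of standard deviation sigma."""
    rng = np.random.default_rng(seed)

    def log_like(m):
        r = forward(m) - d_obs
        return -0.5 * np.sum(r ** 2) / sigma ** 2

    m = np.full(n_layers, 0.2)          # start at mid-range porosity
    ll = log_like(m)
    samples = []
    for _ in range(n_samples):
        prop = m + step * rng.standard_normal(n_layers)
        # proposals outside the uniform prior support are rejected outright
        if np.all((prop >= 0.0) & (prop <= 0.4)):
            ll_prop = log_like(prop)
            if np.log(rng.random()) < ll_prop - ll:   # Metropolis accept
                m, ll = prop, ll_prop
        samples.append(m.copy())
    return np.array(samples)
```

The returned chain is the collection of posterior samples; histograms or means over its tail give the probability maps and property estimates described above.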


Geophysics ◽  
2020 ◽  
Vol 85 (5) ◽  
pp. V407-V414
Author(s):  
Yanghua Wang ◽  
Xiwu Liu ◽  
Fengxia Gao ◽  
Ying Rao

3D seismic data in the prestack domain are often contaminated by impulse noise. We have adopted a robust vector median filter (VMF) for attenuating impulse noise in 3D seismic data cubes. The proposed filter has two attractive features. First, it is robust: the vector median output by the filter not only has a minimum distance to all input data vectors but also a high similarity to the original data vector. Second, it is structure adaptive: the filter is implemented following the local structure of coherent seismic events. The application of the robust, structure-adaptive VMF is demonstrated on a data set acquired from an area with strong sedimentary rhythmites composed of steeply dipping thin layers. The filter significantly improves the signal-to-noise ratio of the seismic data while preserving discontinuities of reflections and maintaining amplitude fidelity, which facilitates subsequent reservoir characterization.
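The basic vector-median operation, before the robustness and structure-adaptive refinements the authors add, can be sketched as follows; `vector_median` is an illustrative name, and each row stands for one data vector drawn from a local filter window.

```python
import numpy as np

def vector_median(vectors):
    """Vector median of a set of input vectors: the member whose summed
    Euclidean distance to all other members is smallest (plain VMF)."""
    # pairwise difference tensor: diff[i, j] = vectors[i] - vectors[j]
    diff = vectors[:, None, :] - vectors[None, :, :]
    dist_sums = np.sqrt((diff ** 2).sum(axis=-1)).sum(axis=1)
    return vectors[dist_sums.argmin()]
```

Because the output is always one of the input vectors, an impulse-noise outlier in the window cannot contaminate the result the way a component-wise mean (or even a component-wise median) could.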

