Sampling two‐dimensional seismic data and their Radon transforms

Geophysics ◽  
1989 ◽  
Vol 54 (10) ◽  
pp. 1318-1325 ◽  
Author(s):  
Virgil Bardan

2‐D seismic data are usually sampled and processed in a rectangular grid, for which sampling requirements are generally derived from the usual 1‐D viewpoint. For a 2‐D seismic data set, the band region (the region of the Fourier plane in which the amplitude spectrum exceeds some very small number) can be approximated by a domain bounded by two triangles. Considering the particular shape of this band region, I use 2‐D sampling theory to obtain results applicable to seismic data processing. The 2‐D viewpoint leads naturally to weaker sampling requirements than does the 1‐D viewpoint; i.e., fewer sample points are needed to represent data with the same degree of accuracy. The sampling of 2‐D seismic data and of their Radon transform in a parallelogram and then in a triangular grid is introduced. The triangular sampling grid is optimal in these cases, since it requires the minimum number of sample points—equal to half the number required by a parallelogram or rectangular grid. The sampling of 2‐D seismic data in a triangular grid is illustrated by examples of synthetic and field seismic sections. The properties of parallelogram grid sampling impose an additional sampling requirement on the 2‐D seismic data in order to evaluate their Radon transform numerically; i.e., the maximum value of the spatial sampling interval must be half of that required by the sampling theorem.
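The halving claim follows from the 2-D (Landau) sampling argument that the minimum sampling density is proportional to the area of the spectral band region, and the bow-tie region of two triangles covers half its bounding rectangle. A back-of-envelope sketch, with assumed illustrative frequency and wavenumber limits:

```python
# Back-of-envelope link between band-region area and minimum sampling
# density: samples per unit area scale with the area of the band region.
# f_max and k_max are illustrative values, not taken from the paper.
f_max = 60.0      # Hz, maximum temporal frequency (assumed)
k_max = 0.02      # cycles/m, maximum wavenumber (assumed)

rect_area = (2 * f_max) * (2 * k_max)   # bounding rectangle in the (f, k) plane
bowtie_area = rect_area / 2.0           # the two triangles cover half of it

# Optimal (triangular) sampling therefore needs half the points of the
# rectangular grid derived from the 1-D viewpoint:
ratio = bowtie_area / rect_area
```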

Geophysics ◽  
2013 ◽  
Vol 78 (4) ◽  
pp. U41-U51 ◽  
Author(s):  
Jingwei Hu ◽  
Sergey Fomel ◽  
Laurent Demanet ◽  
Lexing Ying

Generalized Radon transforms, such as the hyperbolic Radon transform, cannot be implemented as efficiently in the frequency domain as convolutions, thus limiting their use in seismic data processing. We have devised a fast butterfly algorithm for the hyperbolic Radon transform. The basic idea is to reformulate the transform as an oscillatory integral operator and to construct a blockwise low-rank approximation of the kernel function. The overall structure follows the Fourier integral operator butterfly algorithm. For 2D data, the algorithm runs in complexity [Formula: see text], where [Formula: see text] depends on the maximum frequency and offset in the data set and the range of parameters (intercept time and slowness) in the model space. From a series of studies, we found that this algorithm can be significantly more efficient than the conventional time-domain integration.
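The conventional time-domain integration that serves as the baseline can be sketched as a brute-force adjoint summation along hyperbolas; the grids and variable names below are illustrative, not taken from the paper:

```python
import numpy as np

# Minimal time-domain hyperbolic Radon adjoint: the slow baseline that
# the butterfly algorithm accelerates. Grids and names are illustrative.
def hyperbolic_radon_adjoint(data, t, x, tau, q):
    """Sum data(t, x) along hyperbolas t(x) = sqrt(tau^2 + (q*x)^2)."""
    dt = t[1] - t[0]
    m = np.zeros((len(tau), len(q)))
    for i, ta in enumerate(tau):
        for j, qj in enumerate(q):
            th = np.sqrt(ta**2 + (qj * x)**2)          # hyperbolic traveltime
            it = np.round((th - t[0]) / dt).astype(int)
            ok = it < len(t)                           # keep samples in range
            m[i, j] = data[it[ok], np.arange(len(x))[ok]].sum()
    return m
```

The nested loops make the cost proportional to the data size times the model size, which is exactly the scaling the butterfly reformulation avoids.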


Geophysics ◽  
2018 ◽  
Vol 83 (1) ◽  
pp. V39-V48 ◽  
Author(s):  
Ali Gholami ◽  
Toktam Zand

The focusing power of the conventional hyperbolic Radon transform decreases for long-offset seismic data due to the nonhyperbolic behavior of moveout curves at far offsets. Furthermore, conventional Radon transforms are ineffective for processing data sets containing events of different shapes. The shifted hyperbola is a flexible three-parameter (zero-offset traveltime, slowness, and focusing-depth) function that can generate linear and hyperbolic shapes and improves the accuracy of the seismic traveltime approximation at far offsets. A Radon transform based on shifted hyperbolas thus improves the focusing of seismic events in the transform domain. We have developed a new method for effective decomposition of seismic data by using such a three-parameter Radon transform. A very fast algorithm is constructed for high-resolution calculations of the new Radon transform using the recently proposed generalized Fourier slice theorem (GFST). The GFST establishes an analytic expression between the [Formula: see text] coefficients of the data and the [Formula: see text] coefficients of its Radon transform, with which very fast switching between the model and data spaces is possible by means of interpolation procedures and fast Fourier transforms. The high performance of the new algorithm is demonstrated on synthetic and real data sets for trace interpolation and linear (ground-roll) noise attenuation.
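One common shifted-hyperbola parameterization (after Castle, 1994) illustrates how a single extra parameter S interpolates between hyperbolic and nonhyperbolic moveout; the paper's own three parameters (zero-offset traveltime, slowness, focusing depth) may map onto these differently:

```python
import numpy as np

# Shifted-hyperbola traveltime in the Castle (1994) form; S = 1 recovers
# the ordinary NMO hyperbola, S != 1 bends the far-offset behavior.
# This is a sketch of the curve family, not the paper's exact operator.
def shifted_hyperbola(x, t0, v, S):
    """Traveltime t(x) for zero-offset time t0, velocity v, shift parameter S."""
    return t0 * (1.0 - 1.0 / S) + np.sqrt((t0 / S)**2 + x**2 / (S * v**2))
```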


Geophysics ◽  
2003 ◽  
Vol 68 (1) ◽  
pp. 337-345 ◽  
Author(s):  
Yanghua Wang

Applying inverse Q filtering to surface seismic data may minimize the effect of dispersion and attenuation and hence improve the seismic resolution. In this case study, a stabilized inverse Q filter is applied to a land seismic data set, for which the prerequisite reliable earth Q function is estimated from the vertical seismic profile (VSP) downgoing wavefield. The paper focuses on the robust estimate of Q values from VSP data and on the quantitative evaluation of the effectiveness of the stabilized inverse Q filtering approach. The quantitative evaluation shows that inverse Q filtering may flatten the amplitude spectrum, strengthen the time‐variant amplitude, increase the spectral bandwidth, and improve the signal‐to‐noise (S/N) ratio. A parameter measuring the resolution enhancement is defined as a function of the changes in the bandwidth and the S/N ratio. The stabilized inverse Q filtering algorithm, which may provide a stable solution for compensating the high‐frequency wave components lost through attenuation, has positive changes in both the bandwidth and the S/N ratio, and thereby enhances the resolution of the final processed seismic data.
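A common way to stabilize inverse Q amplitude compensation is a Tikhonov-style bounded gain; the sketch below assumes that form with illustrative parameter values, and the paper's exact operator may differ:

```python
import numpy as np

# Stabilized amplitude-compensation gain: with accumulated attenuation
# a = exp(-pi * f * tau / Q), the naive gain 1/a blows up at high
# frequencies; the stabilized form a / (a^2 + sigma2) is bounded by
# 1 / (2 * sqrt(sigma2)). Assumed form and values, for illustration.
def stabilized_gain(f, tau, Q, sigma2=1e-4):
    a = np.exp(-np.pi * f * tau / Q)   # attenuation over traveltime tau
    return a / (a**2 + sigma2)         # ~1/a where signal survives, -> 0 where lost
```

Where attenuation is mild the gain approximates full compensation 1/a; where attenuation has destroyed the signal the gain rolls off instead of amplifying noise, which is the stability property the abstract refers to.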


Geophysics ◽  
2006 ◽  
Vol 71 (5) ◽  
pp. U67-U76 ◽  
Author(s):  
Robert J. Ferguson

The possibility of improving regularization/datuming of seismic data is investigated by treating wavefield extrapolation as an inversion problem. Weighted, damped least squares is then used to produce the regularized/datumed wavefield. Regularization/datuming is extremely costly because of the computation of the Hessian, so an efficient approximation is introduced. The approximation is achieved by computing a limited number of diagonals in the operators involved. Real and synthetic data examples demonstrate the utility of this approach. For synthetic data, regularization/datuming is demonstrated for large extrapolation distances using a highly irregular recording array. Without approximation, regularization/datuming returns a regularized wavefield with reduced operator artifacts when compared to a nonregularizing method such as generalized phase shift plus interpolation (PSPI). Approximate regularization/datuming returns a regularized wavefield at approximately two orders of magnitude lower cost, but it is dip limited, though in a controllable way, compared with the full method. The Foothills structural data set, a freely available data set from the Rocky Mountains of Canada, demonstrates application to real data. The data have highly irregular sampling along the shot coordinate, and they suffer from significant near-surface effects. Approximate regularization/datuming returns common receiver data that are superior in appearance to those from conventional datuming.
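The diagonal-limited Hessian idea can be illustrated on a toy damped-least-squares problem; the operator, damping, and band width below are arbitrary stand-ins, not the paper's extrapolation operators:

```python
import numpy as np

# Toy damped least squares m = (A^T A + mu I)^{-1} A^T d, solved once
# with the full Hessian and once with only a few of its diagonals kept,
# which is the cost-saving idea described in the abstract.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20))   # stand-in for the extrapolation operator
d = rng.standard_normal(40)         # stand-in for recorded data
mu = 0.1                            # damping weight

H = A.T @ A                                         # full Hessian
m_full = np.linalg.solve(H + mu * np.eye(20), A.T @ d)

k = 3                                               # keep diagonal +/- 3 bands
mask = np.abs(np.subtract.outer(np.arange(20), np.arange(20))) <= k
H_band = np.where(mask, H, 0.0)                     # band-limited Hessian
m_band = np.linalg.solve(H_band + mu * np.eye(20), A.T @ d)
```

Storing and applying only 2k+1 diagonals is what reduces the cost; the price is the dip limitation the abstract mentions, here loosely mirrored by the discarded off-band coupling.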


Geophysics ◽  
2016 ◽  
Vol 81 (6) ◽  
pp. A17-A21 ◽  
Author(s):  
Juan I. Sabbione ◽  
Mauricio D. Sacchi

The coefficients that synthesize seismic data via the hyperbolic Radon transform (HRT) are estimated by solving a linear-inverse problem. In the classical HRT, the computational cost of the inverse problem is proportional to the size of the data and the number of Radon coefficients. We have developed a strategy that significantly speeds up the implementation of time-domain HRTs. For this purpose, we have defined a restricted model space of coefficients applying hard thresholding to an initial low-resolution Radon gather. Then, an iterative solver that operated on the restricted model space was used to estimate the group of coefficients that synthesized the data. The method is illustrated with synthetic data and tested with a marine data example.
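The restricted-model-space strategy can be sketched with a generic matrix standing in for the HRT operator: hard-threshold a crude adjoint image, then invert only over the surviving coefficients. All sizes and the threshold fraction are illustrative:

```python
import numpy as np

# Restricted-model-space inversion sketch: L is a generic linear operator
# (a matrix stand-in for the hyperbolic Radon transform).
rng = np.random.default_rng(1)
L = rng.standard_normal((60, 30))
m_true = np.zeros(30)
m_true[[4, 17]] = [2.0, -1.5]                 # sparse "Radon gather"
d = L @ m_true                                # noise-free data

m0 = L.T @ d                                  # low-resolution adjoint image
keep = np.abs(m0) >= 0.3 * np.abs(m0).max()   # hard thresholding
Lr = L[:, keep]                               # operator restricted to support
m_r, *_ = np.linalg.lstsq(Lr, d, rcond=None)  # solve only for kept coefficients
m_est = np.zeros(30)
m_est[keep] = m_r
```

Because the solver touches only the thresholded support, each iteration costs a fraction of the full-model-space solve, which is the speedup the abstract reports.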


Geophysics ◽  
2006 ◽  
Vol 71 (5) ◽  
pp. C81-C92 ◽  
Author(s):  
Helene Hafslund Veire ◽  
Hilde Grude Borgos ◽  
Martin Landrø

Effects of pressure and fluid saturation can have the same degree of impact on seismic amplitudes and differential traveltimes in the reservoir interval; thus, they are often inseparable by analysis of a single stacked seismic data set. In such cases, time-lapse AVO analysis offers an opportunity to discriminate between the two effects. We quantify the uncertainty in estimations to utilize information about pressure- and saturation-related changes in reservoir modeling and simulation. One way of analyzing uncertainties is to formulate the problem in a Bayesian framework. Here, the solution of the problem will be represented by a probability density function (PDF), providing estimations of uncertainties as well as direct estimations of the properties. A stochastic model for estimation of pressure and saturation changes from time-lapse seismic AVO data is investigated within a Bayesian framework. Well-known rock physical relationships are used to set up a prior stochastic model. PP reflection coefficient differences are used to establish a likelihood model for linking reservoir variables and time-lapse seismic data. The methodology incorporates correlation between different variables of the model as well as spatial dependencies for each of the variables. In addition, information about possible bottlenecks causing large uncertainties in the estimations can be identified through sensitivity analysis of the system. The method has been tested on 1D synthetic data and on field time-lapse seismic AVO data from the Gullfaks Field in the North Sea.
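For a linear-Gaussian slice of such a Bayesian formulation, the posterior PDF is available in closed form; the toy operator and covariances below are illustrative stand-ins for the paper's stochastic model, not its actual rock-physics relationships:

```python
import numpy as np

# Gaussian Bayesian linear update: prior N(0, C_m) on two reservoir
# changes, linearized forward map G, Gaussian noise N(0, C_d).
# All numbers are hypothetical, chosen only to show the mechanics.
G = np.array([[0.8, 0.3],
              [0.5, 0.9]])          # hypothetical linearized AVO operator
C_m = np.diag([1.0, 1.0])           # prior covariance
C_d = 0.1 * np.eye(2)               # data-noise covariance
d = np.array([0.6, 0.7])            # observed time-lapse AVO attributes

# Posterior is N(mu_post, C_post) with
# C_post = (G^T C_d^-1 G + C_m^-1)^-1,  mu_post = C_post G^T C_d^-1 d
C_post = np.linalg.inv(G.T @ np.linalg.inv(C_d) @ G + np.linalg.inv(C_m))
mu_post = C_post @ G.T @ np.linalg.inv(C_d) @ d
```

The posterior covariance directly quantifies the estimation uncertainty the abstract emphasizes: directions in which G is nearly blind keep close to the prior variance, flagging the "bottlenecks" a sensitivity analysis would identify.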


Geophysics ◽  
2015 ◽  
Vol 80 (5) ◽  
pp. B115-B129 ◽  
Author(s):  
Rie Kamei ◽  
Takayuki Miyoshi ◽  
R. Gerhard Pratt ◽  
Mamoru Takanashi ◽  
Shogo Masaya


2014 ◽  
Vol 672-674 ◽  
pp. 1964-1967 ◽  
Author(s):  
Jun Qiu Wang ◽  
Jun Lin ◽  
Xiang Bo Gong

Vibroseis obtains the seismic record by cross-correlation detection. Compared with a dynamite source, cross-correlation detection can suppress random noise, but it produces more correlation noise. This paper studies the use of the Radon transform to remove correlation noise produced by an electromagnetically driven vibroseis and an impact rammer. The results of processing field seismic records show that the Radon transform can remove vibroseis correlation noise, effectively improving the SNR of vibroseis seismic data.
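Correlation detection itself can be sketched in a few lines: correlating the recorded trace with the sweep collapses each arrival to a compact (Klauder) wavelet, with correlation sidelobes as the price. The sweep and reflectivity below are synthetic stand-ins:

```python
import numpy as np

# Vibroseis correlation detection sketch: a 10-40 Hz linear sweep,
# a two-spike reflectivity, and cross-correlation of the recorded
# trace with the sweep. All parameters are illustrative.
fs, T = 500.0, 4.0                        # sample rate (Hz), sweep length (s)
t = np.arange(0.0, T, 1.0 / fs)
sweep = np.sin(2 * np.pi * (10 * t + (40 - 10) / (2 * T) * t**2))

refl = np.zeros(3000)
refl[[500, 1400]] = [1.0, -0.6]           # two reflectors
record = np.convolve(refl, sweep)[:len(refl)]   # uncorrelated field record

# Correlation collapses each sweep-long arrival back to its onset time:
corr = np.correlate(record, sweep, mode='full')[len(sweep) - 1:]
```

The peaks of `corr` sit at the reflector times, while the sweep's autocorrelation sidelobes spread around them; that residual energy is the correlation noise the Radon transform is used to attenuate.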


Geophysics ◽  
2017 ◽  
Vol 82 (3) ◽  
pp. R199-R217 ◽  
Author(s):  
Xintao Chai ◽  
Shangxu Wang ◽  
Genyang Tang

Seismic data are nonstationary due to subsurface anelastic attenuation and dispersion effects. These effects, also referred to as the earth’s [Formula: see text]-filtering effects, can diminish seismic resolution. We previously developed a method of nonstationary sparse reflectivity inversion (NSRI) for resolution enhancement, which avoids the intrinsic instability associated with inverse [Formula: see text] filtering and generates superior [Formula: see text] compensation results. Applying NSRI to data sets that contain multiples (addressing surface-related multiples only) requires a demultiple preprocessing step because NSRI cannot distinguish primaries from multiples and will treat them as interference convolved with incorrect [Formula: see text] values. However, multiples contain information about subsurface properties. To use the information carried by multiples, with the feedback model and NSRI theory, we adapt NSRI to the context of nonstationary seismic data with surface-related multiples. Consequently, not only are the benefits of NSRI (e.g., circumventing the intrinsic instability associated with inverse [Formula: see text] filtering) extended, but multiples are also considered. Our method is limited to a 1D implementation. Theoretical and numerical analyses verify that given a wavelet, the input [Formula: see text] values primarily affect the inverted reflectivities and exert little effect on the estimated multiples; i.e., multiple estimation need not consider [Formula: see text] filtering effects explicitly. However, there are benefits for NSRI in considering multiples: the periodicity and amplitude of the multiples imply the position of the reflectivities and the amplitude of the wavelet. Multiples assist in overcoming the scaling and shifting ambiguities of conventional problems in which multiples are not considered. Experiments using a 1D algorithm on a synthetic data set, the publicly available Pluto 1.5 data set, and a marine data set support the aforementioned findings and reveal the stability, capabilities, and limitations of the proposed method.
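The role of multiple periodicity can be seen in a toy 1D feedback model: a single surface reflector of coefficient r at lag n0 generates multiples at lags k*n0 with amplitudes r*(-r)^(k-1), so their spacing and decay encode the reflector's position and strength. The values below are illustrative:

```python
import numpy as np

# Toy 1-D surface-multiple series from the feedback model: one reflector
# of coefficient r at lag n0 produces multiples at k*n0 whose signs
# alternate and whose amplitudes decay geometrically. Illustrative only.
n, n0, r = 256, 40, 0.4
trace = np.zeros(n)
k = 1
while k * n0 < n:
    trace[k * n0] = r * (-r)**(k - 1)   # primary at k=1, multiples after
    k += 1
```

This is why the abstract notes that multiple periodicity and amplitude pin down the reflectivity position and the wavelet scale, resolving the shift and scale ambiguities of a primaries-only inversion.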


Author(s):  
A. Ogbamikhumi ◽  
T. Tralagba ◽  
E. E. Osagiede

Field ‘K’ is a mature field in the coastal swamp of the onshore Niger Delta that has been producing since 1960. As a large producing field with potential for further sustainable production, field monitoring is therefore important for identifying areas of unproduced hydrocarbon. This can be achieved by comparing production data with the corresponding changes in acoustic impedance observed in maps generated from the base survey (initial 3D seismic) and the monitor survey (4D seismic) across the field. This enables the 4D seismic data set to be used for mapping reservoir details such as an advancing water front and unswept zones. The availability of good-quality onshore time-lapse seismic data for Field ‘K’, acquired in 1987 and 2002, provided the opportunity to evaluate the effect of changes in reservoir fluid saturations on time-lapse amplitudes. Rock-physics modelling and fluid-substitution studies on well logs were carried out, and the acoustic impedance change in the reservoir was estimated to be in the range of 0.25% to about 8%. Changes in reservoir fluid saturations were confirmed with time-lapse amplitudes within the crest area of the reservoir structure, where reservoir porosity is 0.25. In this paper, we demonstrate the use of repeat seismic to delineate swept zones and areas affected by water override in a producing onshore reservoir.
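The fluid-substitution step in such a feasibility study typically rests on Gassmann's equation; a minimal sketch with standard but illustrative moduli (not values from Field ‘K’) is:

```python
import numpy as np

# Gassmann fluid substitution (standard form): saturated bulk modulus
# from dry-rock, mineral, and fluid bulk moduli plus porosity.
# Input values below are generic illustrations, not field data.
def gassmann_ksat(k_dry, k_min, k_fl, phi):
    """K_sat = K_dry + (1 - K_dry/K_min)^2 /
               (phi/K_fl + (1 - phi)/K_min - K_dry/K_min^2)"""
    b = 1.0 - k_dry / k_min
    return k_dry + b**2 / (phi / k_fl + (1.0 - phi) / k_min - k_dry / k_min**2)
```

Substituting brine for gas stiffens the rock and raises acoustic impedance, which is what makes saturation changes visible as the small (fraction-of-a-percent to few-percent) time-lapse impedance changes the study reports.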

