Refocusing migrated seismic images in absorptive media

Geophysics ◽  
2010 ◽  
Vol 75 (3) ◽  
pp. S103-S110 ◽  
Author(s):  
Changjun Zhang ◽  
Tadeusz J. Ulrych

In seismic exploration, received seismic signals usually experience absorption during propagation, yet seismic migration algorithms seldom account for absorption in their implementations. We have investigated the blurring effect that arises when a regular migration algorithm is used to migrate seismic data affected by absorption. The blurring functions can be calculated numerically, and for layered media a fast algorithm exists for updating the blurring function from one time step to the next. The deblurring process is formulated as a problem of multidimensional nonstationary deconvolution. We use a least-squares inverse scheme to remove the absorption blurring effect and in turn refocus migrated images. The refocusing algorithm is stable, and convergence is achieved within a few iterations at each wavenumber. Experiments on synthetic and real data show that our refocusing technique is valid for compensating for seismic absorption after migration.
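The refocusing step amounts to a damped least-squares inversion of a known blurring operator. A minimal sketch, with a synthetic Gaussian blur standing in for the absorption blurring function and a direct damped solve in place of the authors' per-wavenumber iterations:

```python
import numpy as np

n = 64
t = np.arange(n)
# Nonstationary Gaussian blur: width grows with time, mimicking stronger
# absorption at later times (a synthetic stand-in, not the paper's operator).
sigma = 1.0 + 0.05 * t
B = np.exp(-((t[None, :] - t[:, None]) ** 2) / (2.0 * sigma[:, None] ** 2))
B /= B.sum(axis=1, keepdims=True)

m_true = np.zeros(n)
m_true[[15, 30, 45]] = [1.0, -0.8, 0.6]   # sparse reflectivity
d = B @ m_true                            # blurred ("migrated") trace

eps = 1e-4                                # damping keeps the inverse stable
m_est = np.linalg.solve(B.T @ B + eps * np.eye(n), B.T @ d)
```

The damping term plays the same stabilizing role the abstract attributes to the least-squares scheme: it bounds the amplification of weakly blurred components.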

Geophysics ◽  
2020 ◽  
Vol 85 (2) ◽  
pp. V223-V232 ◽  
Author(s):  
Zhicheng Geng ◽  
Xinming Wu ◽  
Sergey Fomel ◽  
Yangkang Chen

The seislet transform uses the wavelet-lifting scheme and local slopes to analyze seismic data. A key issue in its definition is the design of prediction operators tailored to seismic images and data. We have developed a new formulation of the seislet transform based on the relative time (RT) attribute, which uses the RT volume to construct multiscale prediction operators. With the new prediction operators, the seislet transform is accelerated because distant traces are predicted directly. We apply our method to synthetic and real data to demonstrate that the new approach reduces computational cost and obtains excellent sparse representation on test data sets.
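The lifting scheme the seislet transform builds on can be illustrated with its simplest (Haar-like) predict and update steps; the seislet transform replaces the trivial neighbour prediction below with one that follows local slope or relative time. A minimal sketch:

```python
import numpy as np

def lift_forward(x):
    # Split into even/odd samples, predict odd from even, keep the
    # prediction residual as detail, then update evens to preserve the mean.
    even, odd = x[0::2], x[1::2]
    detail = odd - even            # prediction residual
    approx = even + detail / 2     # update step
    return approx, detail

def lift_inverse(approx, detail):
    # Undo the update, then the prediction, and interleave.
    even = approx - detail / 2
    odd = detail + even
    x = np.empty(2 * approx.size)
    x[0::2], x[1::2] = even, odd
    return x

x = np.array([2.0, 2.0, 4.0, 4.0, 6.0, 6.0])
a, d = lift_forward(x)
```

A signal that the predictor models perfectly (here, equal even/odd pairs) yields zero detail coefficients, which is exactly the sparsity the seislet transform pursues by predicting along structure.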


Geophysics ◽  
2003 ◽  
Vol 68 (1) ◽  
pp. 225-231 ◽  
Author(s):  
Rongfeng Zhang ◽  
Tadeusz J. Ulrych

This paper deals with the design and implementation of a new wavelet frame for noise suppression based on the character of seismic data. In general, wavelet denoising methods widely used in image and acoustic processing use well‐known conventional wavelets which, although versatile, are often not optimal for seismic data. The new approach, physical wavelet frame denoising, uses a wavelet frame that takes into account the characteristics of seismic data in both time and space. Synthetic and real data tests show that the approach is effective even for seismic signals contaminated by strong noise, whether random or coherent, such as ground roll or air waves.


Geophysics ◽  
1999 ◽  
Vol 64 (5) ◽  
pp. 1630-1636 ◽  
Author(s):  
Ayon K. Dey ◽  
Larry R. Lines

In seismic exploration, statistical wavelet estimation and deconvolution are standard tools. Both processes assume randomness in the seismic reflectivity sequence. We examine the validity of this assumption using well‐log synthetic seismograms and a procedure for evaluating the resulting deconvolutions. With real data, we compare our wavelet estimates with the in‐situ recording of the wavelet from a vertical seismic profile (VSP). As a result of this examination, we present a fairly simple test that can be used to evaluate the validity of a randomness assumption. From our test of seismic data in Alberta, we conclude that the assumption of reflectivity randomness is less of a problem in deconvolution than other assumptions such as phase and stationarity.
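A simple whiteness check on the reflectivity autocorrelation conveys the flavour of such a randomness test (a generic sketch, not the authors' actual procedure):

```python
import numpy as np

rng = np.random.default_rng(0)
r = rng.standard_normal(2000)                    # hypothetical white reflectivity
ac = np.correlate(r, r, mode="full")[r.size - 1:]
ac /= ac[0]                                      # normalized autocorrelation

# Whiteness check: for a random sequence of length N, nonzero lags should
# mostly stay within roughly 2/sqrt(N) of zero.
bound = 2.0 / np.sqrt(r.size)
frac_inside = np.mean(np.abs(ac[1:50]) < bound)
```

A reflectivity series that fails such a check (many lags outside the bound) would cast doubt on the randomness assumption underlying statistical wavelet estimation.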


2020 ◽  
Vol 39 (10) ◽  
pp. 711-717
Author(s):  
Mehdi Aharchaou ◽  
Michael Matheney ◽  
Joe Molyneux ◽  
Erik Neumann

Recent demands to reduce turnaround times and expedite investment decisions in seismic exploration have invited new ways to process and interpret seismic data. Among these ways is a more integrated collaboration between seismic processors and geologist interpreters aiming to build preliminary geologic models for early business impact. A key aspect has been quick and streamlined delivery of clean high-fidelity 3D seismic images via postmigration filtering capabilities. We present a machine learning-based example of such a capability built on recent advances in deep learning systems. In particular, we leverage the power of Siamese neural networks, a new class of neural networks that is powerful at learning discriminative features. Our novel adaptation, edge-aware filtering, employs a deep Siamese network that ranks similarity between seismic image patches. Once the network is trained, we capitalize on the learned features and self-similarity property of seismic images to achieve within-image stacking power endowed with edge awareness. The method generalizes well to new data sets due to the few-shot learning ability of Siamese networks. Furthermore, the learning-based framework can be extended to a variety of noise types in 3D seismic data. Using a convolutional architecture, we demonstrate on three field data sets that the learned representations lead to superior filtering performance compared to structure-oriented filtering. We examine both filtering quality and ease of application in our analysis. Then, we discuss the potential of edge-aware filtering as a data conditioning tool for rapid structural interpretation.


2014 ◽  
Vol 490-491 ◽  
pp. 1356-1360 ◽  
Author(s):  
Shu Cong Liu ◽  
Er Gen Gao ◽  
Chen Xun

The wavelet packet transform is a time-frequency analysis method superior to the traditional wavelet and Fourier transforms in that it can finely partition seismic data in both time and frequency. A series of simulation experiments on wavelet packet decomposition and reconstruction of synthetic seismic signals at different scales, combined with different levels of noise, was carried out to achieve noise removal at the optimal wavelet decomposition scale. Simulation results and real-data experiments showed that the wavelet packet transform can effectively remove noise from seismic signals while retaining the valid signal; wavelet packet denoising proves very effective.
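Wavelet-domain thresholding of this kind can be sketched with a single-level Haar transform and soft thresholding, a simplified stand-in for a full wavelet packet decomposition:

```python
import numpy as np

def haar_step(x):
    # One Haar analysis level: approximation and detail coefficients.
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def haar_inv(a, d):
    # Exact inverse of haar_step.
    x = np.empty(2 * a.size)
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 256)
clean = np.sin(2 * np.pi * 5 * t) * np.exp(-3 * t)   # toy "seismic" signal
noisy = clean + 0.3 * rng.standard_normal(t.size)

a, d = haar_step(noisy)
thr = 0.3 * np.sqrt(2 * np.log(d.size))               # universal threshold
d_den = np.sign(d) * np.maximum(np.abs(d) - thr, 0.0) # soft thresholding
denoised = haar_inv(a, d_den)
```

A wavelet packet transform would recurse on both the approximation and the detail branches, which is what allows the finer time-frequency partition the abstract describes.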


Water ◽  
2020 ◽  
Vol 12 (6) ◽  
pp. 1765
Author(s):  
Wei Xin ◽  
Fei Tian ◽  
Xiaocai Shan ◽  
Yongjian Zhou ◽  
Huazhong Rong ◽  
...  

Deep carbonate fracture-cavity paleokarst reservoirs are deeply buried and highly heterogeneous, so the corresponding seismic signals have weak amplitudes and low signal-to-noise ratios. Machine learning in seismic exploration, which is developing rapidly with compelling results, provides a new perspective on these problems. However, applying machine learning algorithms directly to deep seismic signals or to seismic attributes of deep carbonate fracture-cavity reservoirs without prior knowledge constraints wastes computation and reduces accuracy. We propose a method that combines geological constraints and machine learning to describe deep carbonate fracture-cavity paleokarst reservoirs. Empirical mode decomposition yields the time–frequency features of the seismic data; a sensitive frequency is then selected using geological prior constraints and input to fuzzy C-means clustering to characterize the reservoir distribution. Application to Tahe oilfield data shows the method's potential to highlight subtle geologic structures that might otherwise go unnoticed when machine learning is applied directly.
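The clustering stage can be illustrated with a minimal fuzzy C-means implementation on a one-dimensional attribute (a generic sketch; the EMD and sensitive-frequency-selection steps are omitted, and the data are hypothetical):

```python
import numpy as np

def fuzzy_cmeans(x, c=2, m=2.0, iters=50):
    """Minimal fuzzy C-means on a 1-D attribute vector."""
    # Deterministic init: centres at evenly spaced percentiles.
    centres = np.percentile(x, np.linspace(10, 90, c))
    for _ in range(iters):
        dist = np.abs(x[None, :] - centres[:, None]) + 1e-12
        u = dist ** (-2.0 / (m - 1))        # membership update
        u /= u.sum(axis=0)
        um = u ** m
        centres = um @ x / um.sum(axis=1)   # centre update
    return centres, u

# Two well-separated attribute populations, e.g. inside/outside reservoir.
rng = np.random.default_rng(1)
x = np.concatenate([np.full(50, 0.2), np.full(50, 1.0)])
x = x + 0.02 * rng.standard_normal(x.size)
centres, u = fuzzy_cmeans(x, c=2)
```

The soft membership matrix `u`, rather than a hard label, is what makes fuzzy C-means attractive for gradational reservoir boundaries.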


Geophysics ◽  
1983 ◽  
Vol 48 (12) ◽  
pp. 1598-1610 ◽  
Author(s):  
J. Bee Bednar

Seismic exploration problems frequently require analysis of noisy data. Traditional processing removes or reduces noise effects by linear statistical filtering. This filtering process can be viewed as a weighted averaging with coefficients chosen to enhance the data information content. When the signal and noise components occupy separate spectral windows, or when the statistical properties of the noise are sufficiently understood, linear statistical filtering is an effective tool for data enhancement. When the noise properties are not well understood, or when the noise and signal occupy the same spectral window, linear or weighted averaging performs poorly as a signal enhancement process. One must look for alternative procedures to extract the desired information. As a nonlinear operation which is statistically similar to averaging, median filtering represents one potential alternative. This paper investigates the application of median filtering to several seismic data enhancement problems. A methodology for using median filtering as one step in cepstral deconvolution or seismic signature estimation is presented. The median filtering process is applied to statistical editing of acoustic impedance data and the removal of noise bursts from reflection data. The most surprising conclusion obtained from the empirical studies on synthetic data is that, in high‐noise situations, cepstral‐based median filtering appears to perform exceptionally well as a deconvolver but poorly as a signature estimator. For real data, the process is stable and, to the extent that the data follow the convolutional model, does a reasonable job at both pulse estimation and deconvolution.
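The burst-removal behaviour of a running median is easy to demonstrate (a minimal sketch, not the paper's cepstral workflow):

```python
import numpy as np

def median_filter(x, width=5):
    # Odd-width running median with edge replication.
    pad = width // 2
    xp = np.pad(x, pad, mode="edge")
    return np.array([np.median(xp[i:i + width]) for i in range(x.size)])

sig = np.sin(np.linspace(0, 4 * np.pi, 200))   # smooth "signal"
spiky = sig.copy()
spiky[[40, 90, 150]] += 8.0                    # isolated noise bursts
cleaned = median_filter(spiky, width=5)
```

Unlike a weighted average, the median discards the outlier in each window entirely rather than smearing it into neighbouring samples, which is why it handles noise bursts that share the signal's spectral window.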


Geophysics ◽  
2006 ◽  
Vol 71 (5) ◽  
pp. U67-U76 ◽  
Author(s):  
Robert J. Ferguson

The possibility of improving regularization/datuming of seismic data is investigated by treating wavefield extrapolation as an inversion problem. Weighted, damped least squares is then used to produce the regularized/datumed wavefield. Regularization/datuming is extremely costly because of computing the Hessian, so an efficient approximation is introduced. Approximation is achieved by computing a limited number of diagonals in the operators involved. Real and synthetic data examples demonstrate the utility of this approach. For synthetic data, regularization/datuming is demonstrated for large extrapolation distances using a highly irregular recording array. Without approximation, regularization/datuming returns a regularized wavefield with reduced operator artifacts when compared to a nonregularizing method such as generalized phase shift plus interpolation (PSPI). Approximate regularization/datuming returns a regularized wavefield for approximately two orders of magnitude less in cost; but it is dip limited, though in a controllable way, compared to the full method. The Foothills structural data set, a freely available data set from the Rocky Mountains of Canada, demonstrates application to real data. The data have highly irregular sampling along the shot coordinate, and they suffer from significant near-surface effects. Approximate regularization/datuming returns common receiver data that are superior in appearance compared to conventional datuming.
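The cost/accuracy trade-off of a diagonal Hessian approximation can be sketched on a toy operator (a hypothetical diagonally dominant matrix, not an actual wavefield extrapolator):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 80
# Toy diagonally dominant operator standing in for the extrapolator.
A = np.eye(n) + 0.02 * rng.standard_normal((n, n))
m_true = rng.standard_normal(n)
d = A @ m_true

H = A.T @ A                                    # full Hessian
eps = 1e-3                                     # damping
m_full = np.linalg.solve(H + eps * np.eye(n), A.T @ d)

# Keep only the Hessian diagonal: far cheaper to form and invert,
# at the price of accuracy (analogous to the paper's dip limitation).
H_diag = np.diag(np.diag(H))
m_approx = np.linalg.solve(H_diag + eps * np.eye(n), A.T @ d)
```

Retaining a limited band of diagonals, as the paper does, interpolates between these two extremes of cost and fidelity.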


2021 ◽  
pp. 1-29
Author(s):  
Papia Nandi ◽  
Patrick Fulton ◽  
James Dale

As rising ocean temperatures can destabilize gas hydrate, identifying and characterizing large shallow hydrate bodies is increasingly important for understanding their hazard potential. In the southwestern Gulf of Mexico, reanalysis of 3D seismic reflection data reveals evidence for six potentially large gas hydrate bodies at shallow depths below the seafloor. We originally interpreted these bodies as salt, as they share common visual characteristics on seismic data with shallow allochthonous salt bodies, including high-impedance boundaries and homogeneous interiors with very little acoustic reflectivity. However, when seismic images are constructed using acoustic velocities associated with salt, the resulting images are of poor quality, containing excessive moveout in common reflection point (CRP) offset image gathers. Further investigation reveals that lower acoustic velocities yield higher quality images with little or no moveout; we believe these lower values are representative of gas hydrate, not salt. Directly underneath these bodies lies a zone of poor reflectivity, which is both typical and expected under hydrate. Observations of gas in a nearby well, other indicators of hydrate in the vicinity, and the regional geologic context all support the interpretation that these large bodies are composed of hydrate. Accounting for uncertainty in porosity and saturation estimates, the total equivalent volume of gas within these bodies is estimated to be as large as 1.5 gigatons, or 10.5 TCF, comparable to the entire proven natural gas reserves of Trinidad and Tobago in 2019.
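The volumetric arithmetic behind such an estimate is straightforward. The numbers below are hypothetical placeholders, not the paper's inputs; only the ~164 m³-of-gas-per-m³-of-hydrate expansion factor is a standard value:

```python
# All reservoir numbers are hypothetical, for illustration only.
bulk_volume_m3 = 4.0e9    # hypothetical combined bulk volume of the bodies
porosity = 0.40           # hypothetical porosity
saturation = 0.70         # hypothetical hydrate saturation of the pore space
expansion = 164.0         # ~m^3 of gas per m^3 of hydrate at standard conditions

gas_m3 = bulk_volume_m3 * porosity * saturation * expansion
gas_tcf = gas_m3 / 2.8317e10   # cubic metres per trillion cubic feet
```

With these placeholder inputs the estimate lands at roughly 6.5 TCF, illustrating how strongly the result depends on the assumed porosity and saturation.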


2021 ◽  
Author(s):  
Donglin Zhu ◽  
Lei Li ◽  
Rui Guo ◽  
Shifan Zhan

Abstract Fault detection is an important but time-consuming task in seismic data interpretation. Traditionally, seismic attributes such as coherency (Marfurt et al., 1998) and curvature (Al-Dossary et al., 2006) are used to detect faults. Recently, machine learning methods such as convolutional neural networks (CNNs) have been used to detect faults by applying various semantic segmentation algorithms to the seismic data (Wu et al., 2019). The most widely used algorithm is U-Net (Ronneberger et al., 2015), which can accurately and efficiently provide probability maps of faults. However, fault probabilities generated by semantic segmentation algorithms are not sufficient for directly recognizing fault types and reconstructing fault surfaces. To address this problem, we propose, for the first time, a workflow that uses an instance segmentation algorithm to detect individual fault lines. Specifically, a modified CNN (LaneNet; Neven et al., 2018) is trained using automatically generated synthetic seismic images and corresponding labels. We then test the trained CNN on both synthetic and field seismic data. Results indicate that the proposed workflow is accurate and effective at detecting faults.

