Four‐component surface‐consistent deconvolution

Geophysics ◽  
1993 ◽  
Vol 58 (3) ◽  
pp. 383-392 ◽  
Author(s):  
Peter W. Cary ◽  
Gary A. Lorentz

When performing four‐component surface‐consistent deconvolution, it is assumed that the decomposition of amplitude spectra into source, receiver, offset, and common‐depth‐point components enables accurate deconvolution filters to be derived. However, relatively little effort has been put into the verification of this assumption. Some verification of the assumption is available by analyzing the results of the surface‐consistent decomposition of real seismic data. The surface‐consistent log‐amplitude spectra of land seismic data are able to provide convincing evidence that the source component collects effects of the source signature and near‐source structural effects, and that the receiver component collects receiver characteristics and near‐receiver structural effects. In addition, the offset component collects effects due to ground roll and average reflectivity, and the CDP component collects mostly random noise unless it is constrained to be smooth. Based on the results of this analysis, deconvolution filters should be constructed from the source and receiver components, while the offset and CDP components are discarded. The four‐component surface‐consistent decomposition can be performed efficiently by making use of a simple rearrangement of the Gauss‐Seidel matrix inversion equations. The algorithm requires just two passes through the prestack data volume, regardless of the sorted order of the data, so it is useful for both two‐dimensional and three‐dimensional (2-D and 3-D) data volumes.
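
For illustration, the decomposition described here models each trace's log-amplitude spectrum as an additive sum of source, receiver, offset, and CDP terms and solves for them iteratively. Below is a minimal Gauss-Seidel solver in Python/NumPy under that model; the function name and the per-frequency looped update are ours, and it does not reproduce the paper's efficient two-pass rearrangement of the equations.

import numpy as np

def surface_consistent_decomp(logamp, indices, n_iter=10):
    """Gauss-Seidel decomposition of log-amplitude spectra
    logamp[trace, freq] into additive components (e.g. source,
    receiver, offset, CDP), each shared by the traces that map
    to it via the integer label arrays in `indices`."""
    comps = {name: np.zeros((idx.max() + 1, logamp.shape[1]))
             for name, idx in indices.items()}
    for _ in range(n_iter):
        for name, idx in indices.items():
            # residual with the other components' current estimates removed
            resid = logamp - sum(comps[o][indices[o]]
                                 for o in indices if o != name)
            # update: mean residual over all traces sharing each index
            counts = np.maximum(np.bincount(idx), 1)[:, None]
            comps[name] = np.array(
                [np.bincount(idx, weights=resid[:, f],
                             minlength=counts.shape[0])
                 for f in range(logamp.shape[1])]).T / counts
    return comps

Here `indices` would be, for example, {'src': src_id, 'rcv': rcv_id, 'off': off_id, 'cdp': cdp_id}, with one integer label per trace; per the paper's conclusion, only the source and receiver components would then be used to build the deconvolution filters.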

Geophysics ◽  
1991 ◽  
Vol 56 (12) ◽  
pp. 2036-2047 ◽  
Author(s):  
G. L. Kinsland ◽  
J. A. McDonald ◽  
G. H. F. Gardner

A thin sand has been successfully imaged at great depth using a carefully designed, high‐resolution, three‐dimensional (3-D), seismic survey. The area of the survey was along a portion of the boundary between northeastern Vermilion Parish and southern Lafayette Parish about twelve miles south of Lafayette, Louisiana. Surface terrain was typically flat farmland at the southern edge of the Pleistocene Prairie Terrace and was ideal for this type of high‐resolution survey. The greatest elevation difference between any two source or receiver locations was about 6 ft (1.8 m). The target for this survey was the Cib jeff sand at a depth of about 13,400 ft (4084 m). The Cib jeff sand is within massive shales and is within a geopressured zone. Relative isolation of the Cib jeff sand by the surrounding shales makes the sand a good candidate for seismic imaging. After some preliminary field tests a survey was designed which used the crossed‐array method in which the source and receiver lines are at some angle to one another, usually orthogonal. Data were collected using a 1024-channel recording system with vibrators as sources. Receiver arrays were not used as it was possible to sweep with frequencies outside the frequency range of the ground roll. A total of [Formula: see text] seismic traces was collected. Many tests were also carried out in the processing of these data and it was found that large variations of offsets in the data volume resulted in deterioration in the quality of stacked data due to nonhyperbolic moveout. The migrated data volume was restricted to traces with less than 7500 ft (2286 m) source‐to‐receiver offset, and to the time window containing the Cib jeff sand, namely from 2.5 to 5.0 s. The Cib jeff sand was successfully imaged and the migrated data volume was interpreted using paper sections. As one would expect, the interpretation based on the total volume is more complex than the interpretation using available conventional two‐dimensional (2-D) lines. In particular the fault pattern interpretation based on 2-D seismic data and well logs is believed to be in error. The western bounding fault is placed further west and other faults were delineated through the reservoir when the interpretation was based on the total 3-D volume. Overall, we believe that this reservoir was mapped with more control than was possible with 2-D data.
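
The offset restriction reflects a standard fact rather than anything specific to this survey: conventional NMO correction assumes the short-spread hyperbolic traveltime, while the actual traveltime contains higher-order terms that grow rapidly with offset (the Taner-Koehler expansion):

$$ t^2(x) \;\approx\; t_0^2 + \frac{x^2}{v_{\mathrm{rms}}^2} + c_4\,x^4 + \cdots $$

At far offsets the quartic and higher terms leave residual moveout after hyperbolic correction, misaligning events across the gather and smearing the stack, which is why traces beyond 7500 ft (2286 m) were excluded.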


2016 ◽  
Vol 4 (2) ◽  
pp. SG1-SG9 ◽  
Author(s):  
Marcus P. Cahoj ◽  
Sumit Verma ◽  
Bryce Hutchinson ◽  
Kurt J. Marfurt

The term acquisition footprint is commonly used to define patterns in seismic time and horizon slices that are closely correlated to the acquisition geometry. Seismic attributes often exacerbate footprint artifacts and may pose pitfalls to the less experienced interpreter. Although removal of the acquisition footprint is the focus of considerable research, the sources of such footprint artifacts are less commonly discussed or illustrated. Based on real data examples, we have hypothesized possible causes of footprint occurrence and reproduced them through synthetic prestack modeling. Then, we processed these models using the same workflows used for the real data. Computation of geometric attributes from the migrated synthetics revealed the same footprint artifacts seen in the real data. These models showed that acquisition footprint could be caused by residual ground roll, inaccurate velocities, and far-offset migration stretch. With this understanding, we have examined the real seismic data volume and found that the key cause of acquisition footprint was inaccurate velocity analysis.
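
As a simple illustration of how such footprint is usually diagnosed (a generic check, not the authors' workflow): periodic patterns locked to the source and receiver line spacing show up as isolated peaks in the 2-D wavenumber spectrum of a time slice.

import numpy as np

def footprint_wavenumber_spectrum(time_slice, dx, dy):
    """2-D amplitude spectrum of a time slice; acquisition footprint
    tied to source/receiver line spacing appears as isolated peaks
    at the corresponding wavenumbers."""
    slab = time_slice - time_slice.mean()          # remove DC bias
    spec = np.abs(np.fft.fftshift(np.fft.fft2(slab)))
    kx = np.fft.fftshift(np.fft.fftfreq(time_slice.shape[0], d=dx))
    ky = np.fft.fftshift(np.fft.fftfreq(time_slice.shape[1], d=dy))
    return kx, ky, spec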


Geophysics ◽  
1972 ◽  
Vol 37 (5) ◽  
pp. 769-787 ◽  
Author(s):  
J. W. C. Sherwood ◽  
P. H. Poe

An economical computer program can stack the data from several adjoining common depth points over a wide range of both dip and normal moveout. We can extract from this a set of seismic wavelets, each possessing a determined dip and normal moveout, which represent the original seismic data in an approximate and compressed form. The seismic wavelets resulting from the processing of a complete seismic line are stored for a variety of subsequent uses, such as the following: 1) Superimpose the wavelets, or a subset of them, to form a record section analogous to a conventional common‐depth‐point stacked section. This facilitates the construction of record sections consisting dominantly of either multiple or primary reflections. Other benefits can arise from improved signal‐to‐random‐noise ratio, the concurrent display of overlapping primary wavelets with widely different normal moveouts, and the elimination of the waveform stretching that occurs on the long offset traces with conventional normal moveout removal. 2) By displaying each picked wavelet as a short dip‐bar located at the correct time and spatial position and annotated with the estimated rms velocity, we can exhibit essentially continuous rms‐velocity data along each reflection. This information can be utilized for the estimation of interval and average velocities. For comparative purposes this velocity‐annotated dip‐bar display is normally formed on the same scale as the conventional common‐depth‐point stack section.
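
The core scan described here can be pictured as follows: for each zero-offset time, stack the gather along trial moveout trajectories and keep the trajectories with high stack power. The Python/NumPy sketch below scans normal moveout only (a full implementation would also scan a linear dip term across adjoining common depth points); all names are illustrative.

import numpy as np

def nmo_scan(gather, offsets, dt, velocities):
    """Scan normal-moveout velocities over a CDP gather; the stack
    power at each (t0, v) flags coherent events with that moveout."""
    n_samp, n_tr = gather.shape
    t0 = np.arange(n_samp) * dt
    power = np.zeros((n_samp, len(velocities)))
    for iv, v in enumerate(velocities):
        # hyperbolic traveltime on each trace for every zero-offset time
        t = np.sqrt(t0[:, None] ** 2 + (offsets[None, :] / v) ** 2)
        idx = np.rint(t / dt).astype(int)
        valid = idx < n_samp
        idx[~valid] = 0
        stacked = np.where(valid, gather[idx, np.arange(n_tr)], 0.0)
        power[:, iv] = stacked.sum(axis=1) ** 2   # stack power
    return power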


Geophysics ◽  
1987 ◽  
Vol 52 (9) ◽  
pp. 1175-1187 ◽  
Author(s):  
Robert J. Greaves ◽  
Terrance J. Fulp

Seismic reflection data were used to monitor the progress of an in‐situ combustion, enhanced oil recovery process. Three sets of three‐dimensional (3-D) data were collected during a one‐year period in order to map the extent and directions of propagation of the burn front over time. Acquisition and processing parameters were identical for each survey so that direct one‐to‐one comparison of traces could be made. Seismic attributes were calculated for each common‐depth‐point data set, and in a unique application of seismic reflection data, the preburn attributes were subtracted from the midburn and postburn attributes. The resulting “difference volumes” of 3-D seismic data showed anomalies which were the basis for the interpretation shown in this case study. Profiles and horizon slices from the data sets clearly show the initiation and development of a bright spot in the reflection from the top of the reservoir and a dim spot in the reflection from a limestone below it. Interpretation of these anomalies is supported by information from postburn coring. The bright spot was caused by increased gas saturation along the top‐of‐reservoir boundary. From postburn core data, a map of burn volume distribution was made. In comparison, the bright spot covered a greater area, and it was concluded that combustion and injection gases had propagated ahead of the actual combustion zone. The dim spot anomaly shows good correlation with the burn volume in distribution and direction. Evidence from postburn logs supports the conclusion that the burn substantially decreased seismic velocity and increased seismic attenuation in the reservoir. Net burn thicknesses measured in the cores were used to calibrate the dim‐spot amplitude. With this calibration, the dim‐spot amplitude at each common depth point was inverted to net burn thickness and a map of estimated burn thickness was made from the seismic data.
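
Two steps in this workflow lend themselves to a compact sketch (illustrative Python; the linear amplitude-to-thickness calibration is our assumption, since the abstract states only that core measurements were used to calibrate the dim-spot amplitude):

import numpy as np

def difference_volume(base_attr, monitor_attr):
    """Time-lapse difference: subtract the preburn attribute volume
    from the midburn or postburn volume, acquired and processed
    identically so traces compare one-to-one."""
    return monitor_attr - base_attr

def invert_dim_spot(amp_at_wells, net_burn_at_wells, amp_map):
    """Fit a calibration between dim-spot amplitude and net burn
    thickness at the cored wells, then apply it at every CDP.
    A linear fit is assumed here for illustration."""
    slope, intercept = np.polyfit(amp_at_wells, net_burn_at_wells, 1)
    return slope * amp_map + intercept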


2014 ◽  
Vol 962-965 ◽  
pp. 132-137
Author(s):  
Huan Song Ren ◽  
Shuang Fang Lu ◽  
Dian Shi Xiao

Faults with throws of less than 5 m are difficult to distinguish on seismic sections because of lateral variations in amplitude and changes arising from differential compaction of sands. The seismic coherence cube highlights discontinuities in the seismic data; here, three-dimensional coherence is computed with partial-wave analysis in the longitudinal and transverse directions. In ant tracking, an ant agent that finds a location in the seismic data volume meeting preset fault conditions releases a “signal” that calls other ants in the region to focus on the discontinuity along its track, until fault tracking and identification are complete. Structural-heterogeneity imaging can generate several geologic attribute volumes, and compounding different attribute volumes for different geologic research targets makes the required geologic features stand out. Using a variety of small-fault interpretation techniques, a series of small faults developed between the main faults was identified, leading to the discovery and delineation of a number of fault-block traps.
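
For reference, the simplest form of the coherence attribute mentioned above correlates each trace with its inline and crossline neighbours over a short vertical window; low coherence flags discontinuities such as small faults. A minimal zero-lag version in Python (published algorithms additionally search over lags and dips):

import numpy as np

def coherence_slice(volume, it, half_win=5):
    """Zero-lag cross-correlation coherence on one time slice of a
    3-D volume shaped (nx, ny, nt); edge traces are left at 1."""
    nx, ny, nt = volume.shape
    lo, hi = max(it - half_win, 0), min(it + half_win + 1, nt)
    coh = np.ones((nx, ny))
    for i in range(nx - 1):
        for j in range(ny - 1):
            a = volume[i, j, lo:hi]
            b = volume[i + 1, j, lo:hi]      # inline neighbour
            c = volume[i, j + 1, lo:hi]      # crossline neighbour
            dab = np.linalg.norm(a) * np.linalg.norm(b)
            dac = np.linalg.norm(a) * np.linalg.norm(c)
            cx = np.dot(a, b) / dab if dab > 0 else 0.0
            cy = np.dot(a, c) / dac if dac > 0 else 0.0
            # geometric mean of the two directional similarities
            coh[i, j] = np.sqrt(max(cx, 0.0) * max(cy, 0.0))
    return coh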


Geophysics ◽  
2020 ◽  
Vol 85 (4) ◽  
pp. V355-V365
Author(s):  
Julián L. Gómez ◽  
Danilo R. Velis

Dictionary learning (DL) is a machine learning technique that can be used to find a sparse representation of a given data set by means of a relatively small set of atoms, which are learned from the input data. DL can remove random noise from seismic data very effectively. However, when seismic data are contaminated with footprint noise, the atoms of the learned dictionary are often a mixture of data and coherent noise patterns. In this scenario, DL requires carrying out a morphological attribute classification of the atoms to separate the noisy atoms from the dictionary. Instead, we have developed a novel DL strategy for the removal of footprint patterns in 3D seismic data that is based on an augmented dictionary built upon appropriately filtering the learned atoms. The resulting augmented dictionary, which contains the filtered atoms and their residuals, has a high discriminative power in separating signal and footprint atoms, thus removing the need for a statistical classification strategy to segregate the atoms of the learned dictionary. We filter the atoms using a domain transform filtering approach, a very efficient edge-preserving smoothing algorithm. As in the so-called coherence-constrained DL method, the proposed DL strategy does not require the user to know or adjust the noise level or the sparsity of the solution for each data set. Furthermore, it only requires one pass of DL and is shown to produce successful transfer learning. This increases the speed of the denoising processing because the augmented dictionary does not need to be calculated for each time slice of the input data volume. Results on synthetic and 3D public-domain poststack field data demonstrate effective footprint removal with accurate edge preservation.
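
For orientation, plain patch-based DL denoising of a time slice can be sketched with scikit-learn's generic dictionary learner as below; this shows only the baseline technique, not the authors' augmented dictionary of domain-transform-filtered atoms and residuals, and the parameter values are arbitrary.

import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.feature_extraction.image import (extract_patches_2d,
                                              reconstruct_from_patches_2d)

def dl_denoise_slice(time_slice, patch=(8, 8), n_atoms=64, sparsity=3):
    """Generic patch-based dictionary-learning denoise of one 2-D
    time slice: learn atoms from the slice's own patches, sparse-code
    each patch, and rebuild the slice from the approximations."""
    patches = extract_patches_2d(time_slice, patch)
    X = patches.reshape(len(patches), -1).astype(float)
    mean = X.mean(axis=1, keepdims=True)
    X -= mean                                  # learn on zero-mean patches
    dico = MiniBatchDictionaryLearning(
        n_components=n_atoms, alpha=1.0, max_iter=200,
        transform_algorithm='omp',
        transform_n_nonzero_coefs=sparsity).fit(X)
    code = dico.transform(X)                   # sparse coding of each patch
    denoised = code @ dico.components_ + mean
    return reconstruct_from_patches_2d(
        denoised.reshape(patches.shape), time_slice.shape)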


2015 ◽  
Vol 2015 ◽  
pp. 1-7 ◽  
Author(s):  
Guxi Wang ◽  
Ling Chen ◽  
Si Guo ◽  
Yu Peng ◽  
Ke Guo

Improving the signal-to-noise ratio is an important aspect of seismic data processing. The main work of this paper is to combine the characteristics of seismic data with the wavelet-transform method to suppress random noise, aiming to improve the signal-to-noise ratio with techniques suited to large data systems so that the method can be widely promoted and applied. In recent years, prestack denoising of all-digital three-dimensional seismic data has been a key step in data processing. Addressing the characteristics of all-digital three-dimensional seismic data, and building on previous studies, a new threshold function is proposed. Compared with the conventional hard and soft thresholds, this function is not only easy to compute but also has excellent mathematical properties and a clear physical meaning. Simulation results show that the method removes random noise well. Applying this threshold function in the processing of seismic data from an unconventional lithologic gas reservoir with low porosity, low permeability, low abundance, and strong heterogeneity shows that the denoising method can effectively improve the processing results and enhance the signal-to-noise ratio (SNR).
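
The abstract does not give the new threshold function itself, so as a stand-in the sketch below shows the conventional hard and soft rules it is compared against, plus one classical compromise between them (the "firm" threshold of Gao and Bruce) of the general kind the paper proposes:

import numpy as np

def hard_threshold(w, t):
    """Keep coefficients at or above the threshold, zero the rest."""
    return np.where(np.abs(w) >= t, w, 0.0)

def soft_threshold(w, t):
    """Shrink every coefficient toward zero by the threshold."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def firm_threshold(w, t, mu=2.0):
    """Firm threshold (Gao & Bruce): behaves like soft thresholding
    near t, like hard thresholding beyond mu*t; requires mu > 1."""
    aw = np.abs(w)
    out = np.where(aw >= mu * t, w,
                   np.sign(w) * mu * (aw - t) / (mu - 1.0))
    return np.where(aw < t, 0.0, out)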


2013 ◽  
Vol 31 (4) ◽  
pp. 619 ◽  
Author(s):  
Luiz Eduardo Soares Ferreira ◽  
Milton José Porsani ◽  
Michelângelo G. Da Silva ◽  
Giovani Lopes Vasconcelos

Seismic processing aims to provide an adequate image of the subsurface geology. During seismic processing, the filtering of signals considered noise is of utmost importance. Among these signals is surface-rolling noise, better known as ground roll. Ground roll occurs mainly in land seismic data, masking reflections, and its main features are high amplitude, low frequency, and low velocity. This noise is generally attenuated with so-called conventional methods that use 1-D frequency filters or 2-D filters in the f-k domain. This study uses the empirical mode decomposition (EMD) method for ground-roll attenuation. The EMD method was implemented in the FORTRAN 90 programming language and applied in the time and frequency domains. Applying this method to the processing of land seismic line 204-RL-247 in the Tacutu Basin resulted in stacked seismic sections of similar, and sometimes better, quality compared with those obtained using the f-k and high-pass filtering methods.

Keywords: seismic processing, empirical mode decomposition, seismic data filtering, ground-roll.
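
A minimal trace-wise version of the EMD filtering idea, sketched in Python (assuming the third-party PyEMD package; the paper's own FORTRAN 90 implementation and its frequency-domain variant are not reproduced):

import numpy as np
from PyEMD import EMD   # assumes the PyEMD (EMD-signal) package

def emd_groundroll_filter(trace, n_drop=2):
    """Decompose one trace into intrinsic mode functions and discard
    the last (lowest-frequency) modes, where low-frequency,
    high-amplitude ground roll concentrates; return the sum of the
    remaining modes as the filtered trace."""
    imfs = EMD()(np.asarray(trace, dtype=float))   # (n_imfs, n_samples)
    keep = imfs[:-n_drop] if imfs.shape[0] > n_drop else imfs
    return keep.sum(axis=0)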


Author(s):  
Yanyan Ma ◽  
Peng Ding ◽  
Lanlan Li ◽  
Yang Liu ◽  
Ping Jin ◽  
...  

Heart diseases remain the top threat to human health, and the treatment of heart diseases changes with each passing day. Convincing evidence shows that three-dimensional (3D) printing allows for a more precise understanding of the complex anatomy associated with various heart diseases. In addition, 3D-printed models of cardiac diseases may serve as effective educational tools and for hands-on simulation of surgical interventions. We introduce examples of the clinical applications of different types of 3D printing based on specific cases and clinical application scenarios of 3D printing in treating heart diseases. We also discuss the limitations and clinically unmet needs of 3D printing in this context.


2013 ◽  
Vol 56 (7) ◽  
pp. 1200-1208 ◽  
Author(s):  
Yue Li ◽  
BaoJun Yang ◽  
HongBo Lin ◽  
HaiTao Ma ◽  
PengFei Nie
