THE DIGITAL PROCESSING OF SEISMIC DATA

Geophysics, 1967, Vol 32 (6), pp. 988-1002
Author(s): Daniel Silverman

The paper discusses the background of the problem of signal and noise in the seismic process, and the application of the principles of communication theory to this problem. The limitations of the seismic process are discussed along with the types of noises involved, the methods of rejecting noise, the use of filters to reduce noise, characteristics of filters, and the relationships between frequency domain, time domain, mathematical, and digital filters. In the discussion of the electronic data processing of seismic information, the characteristics of an ideal seismic digital computer system are developed in relation to the characteristics of seismic data. The choice between digital and analog field recording is discussed in relation to the needs of the seismic process and the quality of the seismic data. Among the mathematical processes discussed are velocity filtering and a number of types of Wiener filtering, including horizontal stacking, deghosting, deconvolution, and multitrace digital filtering.

2013, Vol 300-301, pp. 1669-1672
Author(s): Yong Li

To improve the quality and precision of seismic data, removing or suppressing the interference waves contained in the seismic signal is an important step in the digital processing of seismic data. The fast Fourier transform (FFT) decomposes a large N-point transform into combinations of smaller DFTs, replacing a large number of multiplication operations with additions and a much smaller number of multiplications, so the computation speed of the discrete Fourier transform (DFT) is greatly increased. The widespread use of the FFT has made it a powerful tool in digital signal processing. The present paper gives a fairly comprehensive account of the principle of filtering, the algorithmic principle of the fast Fourier transform, and its implementation.
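The even/odd decomposition the abstract describes can be sketched in a few lines. This is a minimal pure-Python radix-2 Cooley-Tukey FFT (not the paper's own implementation), checked against a naive DFT:

```python
import cmath

def dft(x):
    """Naive O(N^2) discrete Fourier transform, for reference."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def fft(x):
    """Radix-2 Cooley-Tukey FFT: split an N-point DFT into two N/2-point
    DFTs over the even- and odd-indexed samples, so most of the work
    becomes additions plus N/2 twiddle-factor multiplications per stage.
    Requires len(x) to be a power of two."""
    N = len(x)
    if N == 1:
        return list(x)
    even = fft(x[0::2])
    odd = fft(x[1::2])
    twiddle = [cmath.exp(-2j * cmath.pi * k / N) * odd[k] for k in range(N // 2)]
    return [even[k] + twiddle[k] for k in range(N // 2)] + \
           [even[k] - twiddle[k] for k in range(N // 2)]

signal = [0.0, 1.0, 0.0, -1.0, 0.5, 0.0, -0.5, 0.0]
assert all(abs(a - b) < 1e-9 for a, b in zip(fft(signal), dft(signal)))
```

The recursion gives the O(N log N) cost the abstract refers to, versus O(N^2) for the direct DFT.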


Geophysics, 1967, Vol 32 (3), pp. 414-414
Author(s): Daniel Silverman

In recent years there has been a great surge of interest in the geophysical industry in the digital processing of seismic data. This activity involves the application of statistical methods of analysis of time series. In a way it is a part of the general subject of communication theory. However, the direction this work has taken is in many respects quite divergent from communication theory as it is used in the communications industry. The divergence is so great, in fact, that if it were not for a small group of workers in this field in the early 1950's, it is doubtful whether we would today be in a position to do what is now rapidly becoming standard operating practice in the geophysical industry.


2021, Vol 8
Author(s): Mojtaba Akbari, Jay Carriere, Tyler Meyer, Ron Sloboda, Siraj Husain, ...

During an ultrasound (US) scan, the sonographer is in close contact with the patient, which puts them at risk of COVID-19 transmission. In this paper, we propose a robot-assisted system that automatically scans tissue, increasing the sonographer/patient distance and decreasing the contact duration between them. This method was developed as a quick response to the COVID-19 pandemic. It considers the preferences of sonographers in terms of how US scanning is done and can be trained quickly for different applications. Our proposed system automatically scans the tissue using a dexterous robot arm that holds the US probe. The system assesses the quality of the acquired US images in real time. This US image feedback is used to automatically adjust the US probe contact force based on the quality of the image frame. The quality assessment algorithm is based on three US image features: correlation, compression, and noise characteristics. These features are input to an SVM classifier, and the robot arm adjusts the US scanning force based on the SVM output. The proposed system enables the sonographer to maintain a distance from the patient because the sonographer does not have to hold the probe and press it against the patient's body for any prolonged time. The SVM was trained using bovine and porcine biological tissue; the system was then tested experimentally on plastisol phantom tissue. The experimental results show that our proposed quality assessment algorithm successfully maintains US image quality and is fast enough for use in a robotic control loop.
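The feature-extraction-plus-classifier loop can be sketched as follows. The exact feature definitions and the trained SVM are not given in the abstract, so the three features below (adjacent-line correlation, a dynamic-range proxy for "compression", and residual high-frequency noise) and the linear decision weights are hypothetical placeholders, not the paper's values:

```python
import numpy as np

def quality_features(frame):
    """Three illustrative features loosely following the paper's
    categories (correlation, compression, noise); the actual
    definitions used by the authors are not reproduced here."""
    frame = np.asarray(frame, dtype=float)
    # Correlation: mean correlation coefficient of adjacent scan lines.
    rows = frame - frame.mean(axis=1, keepdims=True)
    num = (rows[:-1] * rows[1:]).sum(axis=1)
    den = np.sqrt((rows[:-1] ** 2).sum(axis=1) * (rows[1:] ** 2).sum(axis=1)) + 1e-12
    corr = float(np.mean(num / den))
    # "Compression": fraction of the 8-bit pixel range in use (hypothetical proxy).
    compression = float((frame.max() - frame.min()) / 255.0)
    # Noise: mean deviation from a 3-sample moving average along each line.
    smooth = np.apply_along_axis(
        lambda r: np.convolve(r, np.ones(3) / 3, "same"), 1, frame)
    noise = float(np.mean(np.abs(frame - smooth)))
    return corr, compression, noise

def good_contact(features, w=(4.0, 1.0, -0.1), b=-1.0):
    """Hypothetical linear SVM decision function: a positive margin means
    the current contact force yields an acceptable image, so the robot
    holds force; otherwise it adjusts."""
    return sum(wi * xi for wi, xi in zip(w, features)) + b > 0
```

In the paper's loop, the SVM output (here `good_contact`) would gate the force adjustment applied by the robot arm at each frame.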


Geophysics, 2001, Vol 66 (1), pp. 40-41
Author(s): Leon Thomsen

The topic of seismic anisotropy in exploration and exploitation has seen a great deal of progress in the past decade‐and‐a‐half. The principal reason for this is the increased (and increasing) quality of seismic data, of the processing done to it, and of the interpretation expected from it. No longer an academic subject of little practical interest, it is now often viewed as one of the crucial factors which, if not taken into account, severely hampers our effective use of the data. The following brief overview is not intended to be exhaustive, since any such attempt would surely be incomplete. However, it does provide a high‐level survey of the advances seen (at the end of this period) to be important by one who was closely involved, and it directly extrapolates this history to predict the future development of the topic.


2018, Vol 7 (2.7), pp. 794
Author(s): E Sai Sumanth, V Joseph, Dr K S Ramesh, Dr S Koteswara Rao

Investigation of signals reflected from the earth's surface and crust helps in understanding its core structure. The wavelet transform is one of the sophisticated tools for analyzing seismic reflections. In the present work a synthetic seismic signal contaminated with noise is synthesized and analyzed using the Ormsby wavelet [1]. The wavelet transform efficiently extracts the spectrum of the synthetic seismic signal, as it smooths the noise present in the data and improves the signal quality of the seismic data due to tremors. The Ormsby wavelet gives the most refined spectrum of the input wave, so it can be used for the analysis of seismic reflections.
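The Ormsby wavelet used here is conventionally defined by four corner frequencies f1 < f2 < f3 < f4 (a flat passband between f2 and f3 with linear tapers outside). A minimal sketch of the standard definition, with illustrative corner frequencies of 5-10-40-45 Hz (the paper's actual parameters are not given in the abstract):

```python
import numpy as np

def ormsby(t, f1=5.0, f2=10.0, f3=40.0, f4=45.0):
    """Ormsby wavelet for corner frequencies f1 < f2 < f3 < f4 in Hz,
    evaluated at times t in seconds. Uses NumPy's normalized sinc,
    np.sinc(x) = sin(pi*x)/(pi*x)."""
    def sinc2(f):
        return np.sinc(f * t) ** 2
    w = (np.pi * f4**2 / (f4 - f3)) * sinc2(f4) \
      - (np.pi * f3**2 / (f4 - f3)) * sinc2(f3) \
      - (np.pi * f2**2 / (f2 - f1)) * sinc2(f2) \
      + (np.pi * f1**2 / (f2 - f1)) * sinc2(f1)
    return w / np.max(np.abs(w))  # normalize peak amplitude to 1

t = np.arange(-0.1, 0.1, 0.001)  # +/- 100 ms at 1 ms sampling
w = ormsby(t)
```

The four independent corners are what make the Ormsby taper sharper and more controllable than, say, a Ricker wavelet, which is set by a single peak frequency.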


1995, Vol 35 (1), pp. 358
Author(s): R. Lovibond, R.J. Suttill, J.E. Skinner, A.N. Aburas

The Penola Trough is an elongate, Late Jurassic to Early Cretaceous, NW-SE trending half graben filled mainly with synrift sediments of the Crayfish Group. Katnook-1 discovered gas in the basal Eumeralla Formation, but all commercial discoveries have been within the Crayfish Group, particularly the Pretty Hill Formation. Recent improvements in seismic data quality, in conjunction with additional well control, have greatly improved the understanding of the stratigraphy, structure and hydrocarbon prospectivity of the trough. Stratigraphic units within the Pretty Hill Formation are now mappable seismically. The maturity of potential source rocks within these deeper units has been modelled, and the distribution and quality of potential reservoir sands at several levels within the Crayfish Group have been studied using both well and seismic data. Evaluation of the structural history of the trough, the risk of a late carbon dioxide charge to traps, the direct detection of gas using seismic AVO analysis, and the petrophysical ambiguities recorded in wells has resulted in new insights. An important new play has been recognised on the northern flank of the Penola Trough: a gas and oil charge from mature source rocks directly overlying basement into a quartzose sand sequence referred to informally as the Sawpit Sandstone. This play was successfully tested in early 1994 by Wynn-1 which flowed both oil and gas during testing from the Sawpit Sandstone. In mid 1994, Haselgrove-1 discovered commercial quantities of gas in a tilted Pretty Hill Formation fault block adjacent to the Katnook Field. These recent discoveries enhance the prospectivity of the Penola Trough and of the Early Cretaceous sequence in the wider Otway Basin where these sediments are within reach of the drill.


Geophysics, 1963, Vol 28 (5), pp. 831-841
Author(s): Lorenz Shock

The designation “Roll‐Along” is used to indicate the use of horizontal‐data‐stacking techniques when field data are derived by the shooting method. The term “Drop‐Along” is used for weight‐drop data. These methods have proven valuable in obtaining usable data in areas where the standard pattern techniques are ineffective, or where extreme multiplicity requirements make patterns economically impractical. Operational techniques have been developed for both Roll‐Along and Drop‐Along in which standard field recording equipment and cables are utilized. A discussion of these techniques and the associated data processing operations is presented. Record‐sections are used to illustrate the quality of data obtainable by these methods and to compare it to pattern shooting.
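The noise-rejection payoff of horizontal stacking can be sketched numerically. This is an idealized illustration, not the paper's processing flow: it assumes moveout and statics have already been applied, so that the traces sharing a midpoint differ only by random noise; the 30 Hz event and 24-fold multiplicity are arbitrary choices.

```python
import numpy as np

def horizontal_stack(traces):
    """Horizontal (common-midpoint) stacking: average the corrected traces
    that share a midpoint. Coherent reflections add in phase, while random
    noise averages down by roughly 1/sqrt(n) for n-fold multiplicity."""
    traces = np.asarray(traces, dtype=float)
    return traces.mean(axis=0)

rng = np.random.default_rng(0)
reflection = np.sin(2 * np.pi * 30 * np.arange(0, 0.2, 0.002))  # 30 Hz event
n = 24                                                          # 24-fold stack
noisy = reflection + rng.normal(0.0, 1.0, size=(n, reflection.size))
stacked = horizontal_stack(noisy)
```

With unit-variance noise, the residual after a 24-fold stack has standard deviation near 1/sqrt(24), which is the "extreme multiplicity" benefit the Roll-Along and Drop-Along field procedures are designed to deliver economically.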


1972, Vol 9 (4), pp. 434-451
Author(s): R. M. Clowes, E. R. Kanasewich

Seismic reflections at near-vertical incidence from within the lower crust were recorded along a 90 km continuous profile in southern Alberta. The digitized seismograms were filtered as a means of improving signal-to-noise ratios. Zero-phase bandpass filtering improves the quality of the records but also introduces a disadvantage; the narrow pass bands cause the filtered seismograms to become more oscillatory and exclusion of the higher frequencies destroys some of the record character which is useful in phase correlation. Velocity filtering is shown to be a more effective means of delineating individual reflection events, particularly if their apparent velocities vary widely. A record section compiled from such filtered data indicates a considerable number of reflecting horizons at various depths in the lower crust. When the reflections are migrated and a "wiggle" structure section compiled, the complexity of the lower crust in southern Alberta is revealed. Vertical relief reaches as much as 9 km over a horizontal distance of 40 km and lateral changes in the layered system can be noted.
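A minimal sketch of zero-phase bandpass filtering, the first technique the abstract evaluates. Operating on FFT amplitudes only leaves the phase spectrum untouched, so arrivals are not shifted in time; the narrow band is also what produces the more oscillatory output the abstract notes as a drawback. The corner frequencies and test signal here are illustrative, not the paper's parameters:

```python
import numpy as np

def zero_phase_bandpass(trace, dt, low, high):
    """Zero-phase band-pass: zero the FFT coefficients outside [low, high] Hz
    and transform back. The amplitude-only operation preserves phase, which
    is what keeps reflection events from being time-shifted."""
    spectrum = np.fft.rfft(trace)
    freqs = np.fft.rfftfreq(len(trace), d=dt)
    spectrum[(freqs < low) | (freqs > high)] = 0.0
    return np.fft.irfft(spectrum, n=len(trace))

dt = 0.004                                             # 4 ms sampling
t = np.arange(0.0, 2.0, dt)
trace = np.sin(2 * np.pi * 12 * t) + np.sin(2 * np.pi * 55 * t)  # signal + noise
filtered = zero_phase_bandpass(trace, dt, 5.0, 30.0)   # keeps the 12 Hz event
```

Velocity filtering, by contrast, discriminates in the frequency-wavenumber (f-k) domain using apparent velocity across traces rather than frequency alone, which is why it separates events that share a frequency band.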


Geophysics, 2013, Vol 78 (5), pp. W31-W44
Author(s): Anton Ziolkowski

I consider the problem of finding the impulse response, or Green’s function, from a measured response including noise, given an estimate of the source time function. This process is usually known as signature deconvolution. Classical signature deconvolution provides no measure of the quality of the result and does not separate signal from noise. Recovery of the earth impulse response is here formulated as the calculation of a Wiener filter in which the estimated source signature is the input and the measured response is the desired output. Convolution of this filter with the estimated source signature is the part of the measured response that is correlated with the estimated signature. Subtraction of the correlated part from the measured response yields the estimated noise, or the uncorrelated part. The fraction of energy not contained in this uncorrelated component is defined as the quality of the filter. If the estimated source signature contains errors, the estimated earth impulse response is incomplete, and the estimated noise contains signal, recognizable as trace-to-trace correlation. The method can be applied to many types of geophysical data, including earthquake seismic data, exploration seismic data, and controlled source electromagnetic data; it is illustrated here with examples of marine seismic and marine transient electromagnetic data.
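The formulation in the abstract can be sketched directly as a least-squares problem: find the filter f that maps the estimated signature s onto the measured response d, split d into correlated and uncorrelated parts, and report the correlated energy fraction as the quality. This is a dense-matrix sketch of that idea (a production code would exploit the Toeplitz structure, e.g. via Levinson recursion); the damping parameter `mu` is my addition for numerical stability, not part of the paper's definition:

```python
import numpy as np

def wiener_signature_filter(signature, measured, nfilt, mu=1e-6):
    """Least-squares (Wiener) filter f with the estimated source signature s
    as input and the measured response d as desired output:
    minimize ||conv(s, f) - d||^2. conv(s, f) is the part of d correlated
    with s; d - conv(s, f) is the estimated noise (uncorrelated part)."""
    npad = len(measured)
    # Convolution matrix: column j is the signature delayed by j samples.
    S = np.zeros((npad, nfilt))
    for j in range(nfilt):
        seg = signature[: npad - j]
        S[j : j + len(seg), j] = seg
    # Damped normal equations (dense solve for clarity).
    f = np.linalg.solve(S.T @ S + mu * np.eye(nfilt), S.T @ measured)
    correlated = S @ f                 # estimated impulse response * signature
    noise = measured - correlated      # estimated uncorrelated part
    quality = 1.0 - np.sum(noise**2) / np.sum(measured**2)
    return f, correlated, noise, quality

# Synthetic check: a hypothetical two-spike earth response and a short signature.
s = np.array([1.0, -0.5, 0.25])
g_true = np.zeros(20)
g_true[3], g_true[10] = 1.0, -0.6
d = np.convolve(s, g_true)             # noise-free "measured" response
f, correlated, noise, quality = wiener_signature_filter(s, d, nfilt=20)
```

On noise-free data the recovered filter reproduces the earth response and the quality approaches one; with signature errors, as the abstract notes, part of the signal leaks into the "noise" and shows up as trace-to-trace correlation.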

