Coherency calculations in the presence of structural dip

Geophysics ◽  
1999 ◽  
Vol 64 (1) ◽  
pp. 104-111 ◽  
Author(s):  
Kurt J. Marfurt ◽  
V. Sudhaker ◽  
Adam Gersztenkorn ◽  
Kelly D. Crawford ◽  
Susan E. Nissen

We have used crosscorrelation, semblance, and eigenstructure algorithms to estimate coherency. The first two algorithms calculate coherency over a multiplicity of trial time lags or dips, with the dip having the highest coherency corresponding to the local dip of the reflector. Partially because of its greater computational cost, our original eigenstructure algorithm calculated coherency along an implicitly flat horizon. Although generalizing the eigenstructure algorithm to search over a range of test dips allowed us to image coherency in the presence of steeply dipping structures, we were somewhat surprised that this generalization concomitantly degraded the quality of the fault images in flatter dip areas. Because it is a local estimation of reflector dip (including as few as five traces), the multidip coherency estimate provides an algorithmically correct, but interpretationally undesirable, estimate of the best apparent dip that explains the offset reflectors across a fault. We ameliorate this problem using two methods, both of which require the smoothing of a locally inaccurate estimate of regional dip. We then calculate our eigenstructure estimate of coherency only along the dip of the reflector, thereby providing maximum lateral resolution of reflector discontinuities. We are thus both better able to explain the superior results obtained by our earliest eigenstructure analysis along interpreted horizon slices, yet able to extend this resolution to steeply dipping reflectors on uninterpreted cubes of seismic data.
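The dip scan described above can be illustrated with a small sketch. This is not the authors' implementation; it is a minimal semblance-based coherency scan over trial dips for a single analysis point, with illustrative names (`semblance_scan`, the `offsets` and `half_win` parameters) and no bounds checking:

```python
import numpy as np

def semblance_scan(traces, dips, offsets, t_idx, half_win=2):
    """Scan trial dips at one analysis point; return the dip with the
    highest semblance coherency, as in a multidip coherency estimate.

    traces  : 2D array (n_traces, n_samples)
    dips    : candidate dips in samples per unit offset (trial time lags)
    offsets : lateral offset of each trace from the analysis point
    t_idx   : sample index of the analysis point (assumed safely inside
              the trace, so shifted indices stay in bounds)
    """
    best_dip, best_c = None, -1.0
    n_tr = traces.shape[0]
    for p in dips:
        num, den = 0.0, 0.0
        for k in range(-half_win, half_win + 1):
            # shift each trace along the trial dip before stacking
            s = [traces[i, int(round(t_idx + k + p * offsets[i]))]
                 for i in range(n_tr)]
            num += sum(s) ** 2                  # energy of the stack
            den += sum(v * v for v in s)        # total energy
        c = num / (n_tr * den) if den > 0 else 0.0
        if c > best_c:
            best_c, best_dip = c, p
    return best_dip, best_c
```

A reflector dipping at exactly one of the trial dips yields semblance 1 at that dip and lower values elsewhere, which is the behavior the abstract exploits to pick the local reflector dip.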

Author(s):  
Yudong Qiu ◽  
Daniel Smith ◽  
Chaya Stern ◽  
Mudong Feng ◽  
Lee-Ping Wang

The parameterization of torsional/dihedral angle potential energy terms is a crucial part of developing molecular mechanics force fields. Quantum mechanical (QM) methods are often used to provide samples of the potential energy surface (PES) for fitting the empirical parameters in these force field terms. To ensure that the sampled molecular configurations are thermodynamically feasible, constrained QM geometry optimizations are typically carried out, which relax the orthogonal degrees of freedom while fixing the target torsion angle(s) on a grid of values. However, the quality of results and computational cost are affected by various factors on a non-trivial PES, such as dependence on the chosen scan direction and the lack of efficient approaches to integrate results started from multiple initial guesses. In this paper we propose a systematic and versatile workflow called TorsionDrive to generate energy-minimized structures on a grid of torsion constraints by means of a recursive wavefront propagation algorithm, which resolves the deficiencies of conventional scanning approaches and generates higher quality QM data for force field development. The capabilities of our method are presented for multi-dimensional scans and multiple initial guess structures, and an integration with the MolSSI QCArchive distributed computing ecosystem is described. The method is implemented in an open-source software package that is compatible with many QM software packages and energy minimization codes.
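The wavefront idea can be sketched in one dimension. The following is an illustrative toy, not the TorsionDrive package itself: `energy_min` stands in for a constrained QM geometry optimization (here a stub whose energy depends only on the angle), and the function/parameter names are assumptions. Each grid point keeps its lowest-energy result, and any improvement re-seeds its neighbors, so the scan direction bias of a one-way sweep disappears and multiple initial guesses merge naturally:

```python
import math

def wavefront_scan_1d(energy_min, grid_step, init_angles):
    """Recursive wavefront propagation over a 1D torsion grid (sketch).

    energy_min(angle, guess) -> (energy, structure) stands in for a
    constrained geometry optimization at fixed torsion `angle`, started
    from `guess`. Returns {grid_angle: (energy, structure)}.
    """
    best = {}
    # seed the wavefront from every initial guess, snapped to the grid
    frontier = [(round(a / grid_step) * grid_step % 360, a)
                for a in init_angles]
    while frontier:
        next_frontier = []
        for angle, guess in frontier:
            e, structure = energy_min(angle, guess)
            if angle not in best or e < best[angle][0] - 1e-10:
                best[angle] = (e, structure)
                # an improved structure propagates to both neighbors
                for nb in ((angle + grid_step) % 360,
                           (angle - grid_step) % 360):
                    next_frontier.append((nb, structure))
        frontier = next_frontier
    return best

def qm_stub(angle, guess):
    """Toy stand-in for a constrained QM optimization; a real optimizer
    would relax `guess` with the torsion fixed at `angle`."""
    return math.cos(math.radians(angle)), float(angle)
```

The loop terminates once no grid point improves, so every point ends up seeded from its best-known neighbor rather than from a single sweep direction.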


Geophysics ◽  
2016 ◽  
Vol 81 (6) ◽  
pp. A17-A21 ◽  
Author(s):  
Juan I. Sabbione ◽  
Mauricio D. Sacchi

The coefficients that synthesize seismic data via the hyperbolic Radon transform (HRT) are estimated by solving a linear inverse problem. In the classical HRT, the computational cost of the inverse problem is proportional to the size of the data and the number of Radon coefficients. We have developed a strategy that significantly speeds up the implementation of time-domain HRTs. For this purpose, we have defined a restricted model space of coefficients applying hard thresholding to an initial low-resolution Radon gather. Then, an iterative solver that operated on the restricted model space was used to estimate the group of coefficients that synthesized the data. The method is illustrated with synthetic data and tested with a marine data example.
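The model-space restriction can be sketched as follows. This is an assumption-laden toy, not the authors' code: the Radon operator is a dense matrix `L` for illustration, and a direct least-squares solve stands in for their iterative solver; only the hard-thresholding restriction is the point:

```python
import numpy as np

def restricted_inversion(L, d, m_low, keep=0.1):
    """Restrict the Radon model space by hard thresholding (sketch).

    L     : forward operator as a dense matrix (illustrative stand-in)
    d     : data vector
    m_low : initial low-resolution gather, e.g. the adjoint L.T @ d
    keep  : fraction of largest-amplitude coefficients to retain
    """
    # hard threshold: keep only the strongest coefficients of m_low
    thresh = np.quantile(np.abs(m_low), 1.0 - keep)
    mask = np.abs(m_low) >= thresh
    # operator restricted to the active coefficients: far fewer unknowns
    Lr = L[:, mask]
    mr, *_ = np.linalg.lstsq(Lr, d, rcond=None)
    m = np.zeros_like(m_low)
    m[mask] = mr
    return m
```

Because the solver only ever touches the masked columns, its per-iteration cost scales with the number of retained coefficients rather than the full model size, which is the source of the speedup the abstract describes.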


Geophysics ◽  
2001 ◽  
Vol 66 (1) ◽  
pp. 40-41 ◽  
Author(s):  
Leon Thomsen

The topic of seismic anisotropy in exploration and exploitation has seen a great deal of progress in the past decade‐and‐a‐half. The principal reason for this is the increased (and increasing) quality of seismic data, of the processing done to it, and of the interpretation expected from it. No longer an academic subject of little practical interest, it is now often viewed as one of the crucial factors which, if not taken into account, severely hampers our effective use of the data. The following brief overview is not intended to be exhaustive, since any such attempt would surely be incomplete. However, it does provide a high‐level survey of the advances seen (at the end of this period) to be important by one who was closely involved, and it directly extrapolates this history to predict the future development of the topic.


2013 ◽  
Vol 378 ◽  
pp. 546-551 ◽  
Author(s):  
Joanna Strug ◽  
Barbara Strug

Mutation testing is an effective technique for assessing the quality of tests provided for a system. However, it suffers from the high computational cost of executing mutants of the system. In this paper, a method of classifying such mutants is proposed. The classification is based on an edit distance kernel and a k-NN classifier. Using the results of this classification, it is possible to predict whether a mutant would be detected by the tests or not. The approach can thus help to lower the number of mutants that have to be executed, and thereby the cost of mutation testing.
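The core mechanism can be sketched briefly. This is not the authors' implementation: it uses plain Levenshtein distance directly in a k-NN majority vote, whereas the paper builds a kernel from the edit distance; mutants are represented here simply as strings, and all names are illustrative:

```python
def edit_distance(a, b):
    """Levenshtein distance between two sequences (two-row DP)."""
    prev = list(range(len(b) + 1))
    for i in range(1, len(a) + 1):
        cur = [i] + [0] * len(b)
        for j in range(1, len(b) + 1):
            cur[j] = min(prev[j] + 1,                       # deletion
                         cur[j - 1] + 1,                    # insertion
                         prev[j - 1] + (a[i - 1] != b[j - 1]))  # substitution
        prev = cur
    return prev[-1]

def predict_mutant(mutant, labeled, k=3):
    """Predict 'killed'/'survived' for a mutant by majority vote among
    its k nearest labeled mutants under edit distance."""
    nearest = sorted(labeled, key=lambda x: edit_distance(mutant, x[0]))[:k]
    votes = [label for _, label in nearest]
    return max(set(votes), key=votes.count)
```

A mutant predicted as "killed" by a syntactically similar, already-executed mutant need not be run at all, which is where the cost saving comes from.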


Geophysics ◽  
2021 ◽  
pp. 1-86
Author(s):  
Wei Chen ◽  
Omar M. Saad ◽  
Yapo Abolé Serge Innocent Oboué ◽  
Liuqing Yang ◽  
Yangkang Chen

Most traditional seismic denoising algorithms damage the useful signals; the damage is visible in the removed noise profiles and is known as signal leakage. The local signal-and-noise orthogonalization method is an effective method for retrieving the leaked signals from the removed noise. However, the trade-off between retrieving leaked signals and rejecting noise is controlled by the smoothing radius parameter of the local orthogonalization method. The smoothing radius is not convenient to adjust because it is a global parameter, whereas seismic data are highly variable locally. To retrieve the leaked signals adaptively, we propose a new dictionary learning method. Because of its patch-based nature, the dictionary learning method can adapt to the local features of seismic data. We train a dictionary of atoms that represent the features of the useful signals from the initially denoised data. Based on the learned features, we retrieve the weak leaked signals from the noise via a sparse coding step. Considering the large computational cost of training a dictionary from high-dimensional seismic data, we leverage a fast dictionary updating algorithm, in which the singular value decomposition (SVD) is replaced by the algebraic mean to update each dictionary atom. We test the performance of the proposed method on several synthetic and field data examples and compare it with that of the state-of-the-art local orthogonalization method.
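The SVD-free atom update can be sketched as follows. This is an assumption, not the authors' exact formula: where K-SVD replaces atom k with the first left singular vector of the restricted residual, the sketch below uses a coefficient-weighted mean of the residual columns (one plausible reading of "algebraic mean"), which costs a matrix-vector product instead of an SVD:

```python
import numpy as np

def update_atom_fast(D, X, Y, k):
    """Update dictionary atom k without an SVD (sketch).

    D : (n_features, n_atoms) dictionary, columns are atoms
    X : (n_atoms, n_patches) sparse codes
    Y : (n_features, n_patches) training patches
    """
    users = np.nonzero(X[k])[0]          # patches that actually use atom k
    if users.size == 0:
        return D, X
    # residual on those patches with atom k's contribution added back
    E = Y[:, users] - D @ X[:, users] + np.outer(D[:, k], X[k, users])
    # coefficient-weighted mean direction in place of the rank-1 SVD step
    atom = E @ X[k, users]
    norm = np.linalg.norm(atom)
    if norm > 0:
        D[:, k] = atom / norm
        X[k, users] = D[:, k] @ E        # refit the codes for atom k
    return D, X
```

When the residual columns all point along one direction, the weighted mean recovers the same atom the SVD would, at a fraction of the cost; this is why the cheap update works well on patch sets dominated by a single coherent feature.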


Author(s):  
Padmalaya Nayak ◽  
Shelendra Kumar Sharma

With the rapid growth of cloud computing, diverse applications are growing exponentially through large data centers connected via the Internet. Cloud gaming is one of the most novel service applications: video games are stored in the cloud, and clients access them as audio/video streams. In practice, cloud gaming substantially reduces the computational cost at the client side and enables the use of thin clients. However, cloud gaming can degrade Quality of Service (QoS) by introducing access latency. The objective of this chapter is to present the impact and effectiveness of cloud gaming applications on users in health care, entertainment, and education.


Geophysics ◽  
2020 ◽  
Vol 85 (2) ◽  
pp. V223-V232 ◽  
Author(s):  
Zhicheng Geng ◽  
Xinming Wu ◽  
Sergey Fomel ◽  
Yangkang Chen

The seislet transform uses the wavelet-lifting scheme and local slopes to analyze the seismic data. In its definition, the designing of prediction operators specifically for seismic images and data is an important issue. We have developed a new formulation of the seislet transform based on the relative time (RT) attribute. This method uses the RT volume to construct multiscale prediction operators. With the new prediction operators, the seislet transform gets accelerated because distant traces get predicted directly. We apply our method to synthetic and real data to demonstrate that the new approach reduces computational cost and obtains excellent sparse representation on test data sets.


2018 ◽  
Vol 7 (2.7) ◽  
pp. 794
Author(s):  
E Sai Sumanth ◽  
V Joseph ◽  
Dr K S Ramesh ◽  
Dr S Koteswara Rao

Investigation of signals reflected from the earth's surface and its crust helps in understanding its core structure. The wavelet transform is one of the sophisticated tools for analyzing seismic reflections. In the present work, a synthetic seismic signal contaminated with noise is synthesized and analyzed using the Ormsby wavelet [1]. The wavelet transform efficiently extracted the spectra of the synthetic seismic signal, as it smooths the noise present in the data and improves the signal quality of the seismic data affected by tremors. The Ormsby wavelet gives the most refined spectrum of the input wave, so it could be used for the analysis of seismic reflections.
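For reference, the Ormsby wavelet is the zero-phase wavelet whose amplitude spectrum is a trapezoid defined by four corner frequencies; a minimal sketch (the default corner frequencies below are illustrative, not taken from the paper):

```python
import numpy as np

def ormsby(t, f1=5.0, f2=10.0, f3=40.0, f4=45.0):
    """Ormsby wavelet for corner frequencies f1 < f2 < f3 < f4 in Hz.

    The amplitude spectrum ramps up over [f1, f2], is flat over
    [f2, f3], and ramps down over [f3, f4]. np.sinc is the normalized
    sinc, sin(pi x)/(pi x).
    """
    def term(f_hi, f_lo):
        return ((np.pi * f_hi) ** 2 * np.sinc(f_hi * t) ** 2
                - (np.pi * f_lo) ** 2 * np.sinc(f_lo * t) ** 2) / (f_hi - f_lo)
    w = term(f4, f3) - term(f2, f1)
    return w / np.max(np.abs(w))          # normalize peak amplitude to 1
```

Being zero-phase, the wavelet is symmetric about t = 0 with its peak there, which is what makes it convenient for building and analyzing synthetic seismic traces.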


1995 ◽  
Vol 35 (1) ◽  
pp. 358 ◽  
Author(s):  
R. Lovibond ◽  
R.J. Suttill ◽  
J.E. Skinner ◽  
A.N. Aburas

The Penola Trough is an elongate, Late Jurassic to Early Cretaceous, NW-SE trending half graben filled mainly with synrift sediments of the Crayfish Group. Katnook-1 discovered gas in the basal Eumeralla Formation, but all commercial discoveries have been within the Crayfish Group, particularly the Pretty Hill Formation. Recent improvements in seismic data quality, in conjunction with additional well control, have greatly improved the understanding of the stratigraphy, structure and hydrocarbon prospectivity of the trough. Stratigraphic units within the Pretty Hill Formation are now mappable seismically. The maturity of potential source rocks within these deeper units has been modelled, and the distribution and quality of potential reservoir sands at several levels within the Crayfish Group have been studied using both well and seismic data. Evaluation of the structural history of the trough, the risk of a late carbon dioxide charge to traps, the direct detection of gas using seismic AVO analysis, and the petrophysical ambiguities recorded in wells has resulted in new insights. An important new play has been recognised on the northern flank of the Penola Trough: a gas and oil charge from mature source rocks directly overlying basement into a quartzose sand sequence referred to informally as the Sawpit Sandstone. This play was successfully tested in early 1994 by Wynn-1 which flowed both oil and gas during testing from the Sawpit Sandstone. In mid 1994, Haselgrove-1 discovered commercial quantities of gas in a tilted Pretty Hill Formation fault block adjacent to the Katnook Field. These recent discoveries enhance the prospectivity of the Penola Trough and of the Early Cretaceous sequence in the wider Otway Basin where these sediments are within reach of the drill.
