A new approach to signature analysis in seismic interpretation using an interactive workstation

1989 ◽  
Vol 20 (2) ◽  
pp. 271
Author(s):  
G.A.D. Paterson

Initial hydrocarbon discoveries normally lead to a succession of wells on the same or similar seismic trends, sometimes with a succession of dry holes. The problem posed is: given some initial success, to what extent can seismic data be used to predict lithology and prevent these dry holes? Satellite technologists already use comparable methods to identify similar terrain features quickly, using an image 'benchmark' and multiple signals. The benchmark for the seismic interpreter is the well, and the multiple signals are the seismic attributes. The tool that brings these together is the interpretation workstation. A demonstration of the technique on synthetic data gives good results, dependent on several factors. Future work will be directed at evaluating the method on field data and in conjunction with other lithology prediction methods.

Geophysics ◽  
2006 ◽  
Vol 71 (1) ◽  
pp. J1-J9 ◽  
Author(s):  
João B. C. Silva ◽  
Valéria C. F. Barbosa

We have developed a new approach for estimating the location and geometry of several density anomalies that give rise to a complex, interfering gravity field. The user interactively defines the assumed outline of the true gravity sources in terms of points and line segments, and the method estimates sources closest to the specified outline to achieve a match between the predicted and observed gravity fields. Each gravity source is assumed to be a homogeneous body with a known density contrast; different density contrasts may be assigned to each source. Tests with synthetic data show that the method can be of use in estimating (1) multiple laterally adjacent and closely situated gravity sources, (2) single gravity sources consisting of several homogeneous compartments with different density contrasts, and (3) two gravity sources with different density contrasts of the same sign, one totally enclosed by the other. The method is also applied to three different sets of field data where the gravity sources belong to the same categories established in the tests with synthetic data. The method produces solutions consistent with the known geologic attributes of the gravity sources, illustrating its potential practicality.


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Manzar Fawad ◽  
Nazmul Haque Mondol

Abstract. Geological CO2 storage can be employed to reduce greenhouse gas emissions to the atmosphere. Depleted oil and gas reservoirs, deep saline aquifers, and coal beds are considered to be viable subsurface CO2 storage options. Remote monitoring is essential for observing CO2 plume migration and potential leak detection during and after injection. Leak detection is probably the main risk, though overall monitoring of the plume boundaries and verification of stored volumes are also necessary. There are many effective remote CO2 monitoring techniques with various benefits and limitations. We suggest a new approach using a combination of repeated seismic and electromagnetic surveys to delineate the CO2 plume and estimate the gas saturation in a saline reservoir during the lifetime of a storage site. This study deals with CO2 plume delineation and saturation estimation using a combination of seismic and electromagnetic or controlled-source electromagnetic (EM/CSEM) synthetic data. We assumed two scenarios over a period of 40 years: Case 1 was modeled assuming both seismic and EM repeated surveys were acquired, whereas in Case 2, repeated EM surveys were acquired with only pre-injection (baseline) 3D seismic data available. Our results show that monitoring the CO2 plume in terms of extent and saturation is possible both by (i) using repeated seismic and electromagnetic surveys and (ii) using a baseline seismic survey in combination with repeated electromagnetic data. Due to the nature of the seismic and EM techniques, spatial coverage from the reservoir's base to the surface makes it possible to detect the CO2 plume's lateral and vertical migration. However, the low resolution and depth uncertainty of CSEM are limitations that need consideration. These results also have implications for monitoring oil production, especially with water flooding, as well as for hydrocarbon exploration and freshwater aquifer identification.


Geophysics ◽  
2011 ◽  
Vol 76 (4) ◽  
pp. F239-F250 ◽  
Author(s):  
Fernando A. Monteiro Santos ◽  
Hesham M. El-Kaliouby

Joint or sequential inversion of direct current resistivity (DCR) and time-domain electromagnetic (TDEM) data is commonly performed for individual soundings assuming layered earth models. DCR and TDEM have different and complementary sensitivities to resistive and conductive structures, making them well suited to joint inversion techniques. Joint inversion of DCR and TDEM data has been used by several authors to reduce the ambiguity of the models calculated from each method separately. A new approach for joint inversion of these data sets, based on a laterally constrained algorithm, is presented. The method was developed for the interpretation of soundings collected along a line over 1D or 2D geology. The inversion algorithm was tested on two synthetic data sets, as well as on field data from Saudi Arabia. The results show that the algorithm is efficient and stable in producing quasi-2D models from DCR and TDEM data acquired in relatively complex environments.
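The laterally constrained idea can be sketched with a toy linear system: each sounding along the line gets its own model vector, the two methods contribute complementary sensitivity rows, and extra equations tie neighbouring models together. Everything below (the sensitivity rows, noise level, and constraint weight) is invented for illustration and is not the authors' algorithm.

```python
import numpy as np

n_s, n_p = 5, 2                       # soundings along the line, parameters each
G_dcr  = np.array([[1.0, 0.2]])       # hypothetical DCR sensitivity row
G_tdem = np.array([[0.1, 1.0]])       # hypothetical TDEM sensitivity row
G = np.vstack([G_dcr, G_tdem])        # joint per-sounding forward operator

# Slowly varying true 2-layer models along the profile, plus noisy data
true = np.array([[10.0 + i, 2.0 + 0.5 * i] for i in range(n_s)])
rng = np.random.default_rng(1)
data = np.concatenate([G @ m for m in true]) + rng.normal(0.0, 0.05, 2 * n_s)

# Assemble one joint system: data equations for every sounding, plus
# lateral constraints lam * (m_i - m_{i+1}) = 0 tying neighbours together.
lam = 0.2
A = np.zeros((2 * n_s + n_p * (n_s - 1), n_p * n_s))
b = np.zeros(A.shape[0])
for i in range(n_s):
    A[2 * i:2 * i + 2, n_p * i:n_p * i + n_p] = G
    b[2 * i:2 * i + 2] = data[2 * i:2 * i + 2]
for i in range(n_s - 1):
    r = 2 * n_s + n_p * i
    A[r:r + n_p, n_p * i:n_p * (i + 1)] = lam * np.eye(n_p)
    A[r:r + n_p, n_p * (i + 1):n_p * (i + 2)] = -lam * np.eye(n_p)
models = np.linalg.lstsq(A, b, rcond=None)[0].reshape(n_s, n_p)
```

Solving all soundings at once is what makes the result quasi-2D: the lateral terms propagate information between stations, which is the stated point of the constrained scheme.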


2010 ◽  
Vol 14 (3) ◽  
pp. 545-556 ◽  
Author(s):  
J. Rings ◽  
J. A. Huisman ◽  
H. Vereecken

Abstract. Coupled hydrogeophysical methods infer hydrological and petrophysical parameters directly from geophysical measurements. Widely used methods do not explicitly recognize uncertainty in parameter estimates. Therefore, we apply a sequential Bayesian framework that provides updates of states, parameters and their uncertainty whenever measurements become available. We have coupled a hydrological and an electrical resistivity tomography (ERT) forward code in a particle filtering framework. First, we analyze a synthetic data set of lysimeter infiltration monitored with ERT. In a second step, we apply the approach to field data measured during an infiltration event on a full-scale dike model. For the synthetic data, the water content distribution and the hydraulic conductivity are accurately estimated after a few time steps. For the field data, hydraulic parameters are successfully estimated from water content measurements made with spatial time domain reflectometry and ERT, and the development of their posterior distributions is shown.
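A minimal sketch of the particle-filter update described above, with a trivial linear forward model standing in for the coupled hydrological–ERT codes; the parameter K, the forward response, and the noise level are all illustrative assumptions.

```python
import numpy as np

def particle_filter_step(particles, weights, obs, forward, obs_std, rng):
    """One sequential Bayesian update: reweight particles by the Gaussian
    likelihood of the new measurement, then resample systematically."""
    pred = np.array([forward(p) for p in particles])
    w = weights * np.exp(-0.5 * ((obs - pred) / obs_std) ** 2)
    w /= w.sum()
    n = len(w)
    # Systematic resampling to avoid weight degeneracy
    positions = (np.arange(n) + rng.random()) / n
    idx = np.minimum(np.searchsorted(np.cumsum(w), positions), n - 1)
    return particles[idx], np.full(n, 1.0 / n)

# Toy stand-in for the coupled forward model: the "measurement" responds
# linearly to a hydraulic-conductivity-like parameter K.
rng = np.random.default_rng(0)
true_K = 1.5
forward = lambda K: 2.0 * K
particles = rng.uniform(0.1, 5.0, 1000)        # prior ensemble for K
weights = np.full(1000, 1.0 / 1000)
for _ in range(20):                            # assimilate 20 measurements
    obs = forward(true_K) + rng.normal(0.0, 0.05)
    particles, weights = particle_filter_step(
        particles, weights, obs, forward, 0.05, rng)
```

After a few updates the ensemble collapses around the true parameter, and its spread is exactly the posterior uncertainty that the abstract says point estimates ignore.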


2018 ◽  
Vol 10 (2) ◽  
pp. 99-118
Author(s):  
Andre Oboler ◽  
Karen Connelly

The Cyber-Racism and Community Resilience (CRaCR) project included an examination of the features of online communities of resistance and solidarity. This work formed a key part of the project’s focus on resilience and produced a deeper understanding of the range of types of actors working in this space and how they might individually contribute effectively to creating resilience. The need for new synergies between different types of stakeholders and approaches was highlighted as an area of future work. This paper explores a design for that future work that builds and supports online communities of resistance and solidarity by drawing on the lessons from the earlier research and extending them. This new work both presents a model for cooperation and explains how different stakeholders can positively engage under the model in a smarter way. That is, through a system which facilitates Solidarity in Moving Against Racism Together while Enabling Resilience. This new approach draws on the strengths of individual actors, but also seeks to turn points of weakness for one actor into opportunities for cooperation that strengthen the system as a whole.


2014 ◽  
Vol 27 (1) ◽  
pp. 67-87
Author(s):  
João Tiago Ribeiro ◽  
Rui Rijo ◽  
António Leal

This research aims to create a new approach to spider map production: a fast, automatic method that takes only network location data as input. The schematization task is commonly done by hand or with general-purpose graphics software. This is a difficult and time-consuming task that also requires a skilled map designer, which makes the outcome expensive. A configurable force-directed algorithm allows fast creation of eye-pleasing schematic maps, avoiding labor-intensive manual arrangement. On the other hand, different sets of design rules and constraints may be used to quickly generate alternatives and to configure a distinctive graphic style. This document presents some of the rules and constraints that may be used to output a map that meets the criteria for use as a spider map in transportation systems. We present results with real public transport network datasets and discuss possible evaluation criteria. The present work introduces a new set of experimental validations that confirm the previous research but also lead to new open issues for future work.
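The force-directed schematization step can be illustrated with a deliberately small sketch: each iteration nudges every edge toward the nearest multiple of 45 degrees, the octilinear convention of schematic transit maps. The force model, step size, and node data below are hypothetical, not the paper's configurable rule set.

```python
import math

def octilinearize(nodes, edges, iters=300, step=0.1):
    """Hypothetical force-directed pass: every iteration rotates each edge
    a small fraction of the way toward the nearest multiple of 45 degrees."""
    pos = {k: list(v) for k, v in nodes.items()}
    for _ in range(iters):
        for a, b in edges:
            ax, ay = pos[a]
            bx, by = pos[b]
            ang = math.atan2(by - ay, bx - ax)
            target = round(ang / (math.pi / 4)) * (math.pi / 4)
            d = step * (target - ang)          # fractional corrective rotation
            mx, my = (ax + bx) / 2, (ay + by) / 2
            hx, hy = bx - mx, by - my          # half-edge vector
            rx = hx * math.cos(d) - hy * math.sin(d)
            ry = hx * math.sin(d) + hy * math.cos(d)
            pos[a] = [mx - rx, my - ry]        # rotate both endpoints
            pos[b] = [mx + rx, my + ry]        # about the edge midpoint
    return pos

# A three-stop line that is almost, but not quite, octilinear
nodes = {"A": (0.0, 0.0), "B": (1.0, 0.15), "C": (1.9, 1.1)}
edges = [("A", "B"), ("B", "C")]
layout = octilinearize(nodes, edges)
```

A real system would balance this angular force against station-spacing, label-placement, and topology-preservation constraints; the point of the sketch is only that small iterative corrections, rather than manual arrangement, drive the layout to a schematic style.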


2019 ◽  
Vol 221 (1) ◽  
pp. 87-96
Author(s):  
S Malecki ◽  
R-U Börner ◽  
K Spitzer

SUMMARY We present a procedure for localizing underground positions using a time-domain inductive electromagnetic (EM) method. The position to be localized is associated with an EM receiver placed inside the Earth. An EM field is generated by one or more transmitters located at known positions at the Earth’s surface. We then invert the EM field data for the receiver positions using a trust-region algorithm. For any given time regime and source–receiver geometry, the propagation of the electromagnetic fields is determined by the electrical conductivity distribution within the Earth. We show that it is sufficient to use a simple 1-D model to recover the receiver positions with reasonable accuracy. Generally, we demonstrate the robustness of the presented approach. Using confidence ellipses and confidence intervals we assess the accuracy of the recovered location data. The proposed method has been extensively tested against synthetic data obtained by numerical experiments. Furthermore, we have successfully carried out a location recovery using field data. The field data were recorded within a borehole in Alberta (Canada) at 101.4 m depth. The recovered location of the borehole receiver differs from the actual location by 0.70 m in the horizontal plane and by 0.82 m in depth.
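The inversion step can be sketched as follows, with a 1/r^3 log-amplitude decay standing in for the full time-domain EM forward model over a 1-D conductivity structure, and a fixed-damping Levenberg–Marquardt loop standing in for the trust-region algorithm; the transmitter layout and receiver position are invented numbers, not the Alberta field geometry.

```python
import numpy as np

# Known transmitter positions at the surface (z = 0); the receiver
# position (x, y, z) inside the Earth is the unknown.
tx = np.array([[0.0, 0.0, 0.0], [200.0, 0.0, 0.0],
               [0.0, 200.0, 0.0], [200.0, 200.0, 0.0]])

def forward(p):
    """Toy forward model: log of a 1/r^3 field-amplitude decay from each
    transmitter (logs condition the least-squares problem better)."""
    r = np.linalg.norm(tx - p, axis=1)
    return np.log(1.0 / r ** 3)

def locate(data, p0, iters=60, lam=1e-3):
    """Damped Gauss-Newton (Levenberg-Marquardt) fit of the receiver
    coordinates to the observed amplitudes."""
    p = np.array(p0, float)
    for _ in range(iters):
        res = forward(p) - data
        # Finite-difference Jacobian of the residuals
        J = np.array([(forward(p + dp) - forward(p)) / 1e-4
                      for dp in 1e-4 * np.eye(3)]).T
        p += np.linalg.solve(J.T @ J + lam * np.eye(3), -J.T @ res)
    return p

true_pos = np.array([80.0, 120.0, 101.4])      # receiver at ~101 m depth
est = locate(forward(true_pos), p0=[100.0, 100.0, 50.0])
```

With four transmitters and three unknowns the position is overdetermined; in the paper the same idea is solved with a trust-region algorithm and the covariance of the fit yields the confidence ellipses used to assess accuracy.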


Geophysics ◽  
2017 ◽  
Vol 82 (5) ◽  
pp. W31-W45 ◽  
Author(s):  
Necati Gülünay

The old technology f-x deconvolution stands for f-x domain prediction filtering. Early versions of it are known to create signal leakage during their application. There have been recent papers in geophysical publications comparing f-x deconvolution results with newly proposed technologies. These comparisons will be most effective if the best existing f-x deconvolution algorithms are used. This paper describes common f-x deconvolution algorithms and studies the signal leakage occurring during their application on simple models, which will hopefully provide a benchmark for readers choosing f-x algorithms for comparison. The f-x deconvolution algorithms can be classified by their use of data, which leads to transient or transient-free matrices and hence windowed or nonwindowed autocorrelations, respectively. They can also be classified by the direction in which they predict: forward design and apply; forward design and apply followed by backward design and apply; forward design and apply followed by application of a conjugated forward filter in the backward direction; and simultaneous forward and backward design and apply, known as noncausal filter design. All of the algorithm types mentioned above are tested, and the results of their analysis are provided on noise-free and noisy synthetic data sets: a single dipping event, a single dipping event with a simple amplitude variation with offset, and three dipping events. Finally, the results of applying the selected algorithms to field data are provided.
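The forward-plus-conjugated-backward variant described above can be sketched one frequency slice at a time; the transient-free design matrix, the filter length, and the single-dipping-event test below are illustrative choices, not the paper's benchmark code.

```python
import numpy as np

def fx_decon(data, filt_len=3):
    """Minimal f-x prediction filtering: transform each trace to the
    frequency domain, design a least-squares forward prediction filter
    along the spatial axis for every frequency, and combine the forward
    prediction with a conjugated-forward backward prediction."""
    nt, nx = data.shape
    D = np.fft.rfft(data, axis=0)              # time -> frequency
    out = D.copy()
    rows = nx - filt_len
    floor = 1e-9 * np.linalg.norm(D)
    for k in range(D.shape[0]):
        s = D[k]
        if np.linalg.norm(s) < floor:          # skip near-empty slices
            continue
        # Transient-free (windowed-data) forward prediction system A g = b
        A = np.array([s[i:i + filt_len] for i in range(rows)])
        b = s[filt_len:]
        g = np.linalg.lstsq(A, b, rcond=None)[0]
        fwd = np.full(nx, np.nan, complex)
        fwd[filt_len:] = A @ g
        # Backward prediction: conjugated forward filter on reversed samples
        Ab = np.array([s[i + 1:i + 1 + filt_len][::-1] for i in range(rows)])
        bwd = np.full(nx, np.nan, complex)
        bwd[:rows] = Ab @ np.conj(g)
        out[k] = np.where(np.isnan(fwd), bwd,
                          np.where(np.isnan(bwd), fwd, 0.5 * (fwd + bwd)))
    return np.fft.irfft(out, n=nt, axis=0)

# A single dipping event (one-sample moveout per trace) is perfectly
# predictable in f-x, so the filter should pass it through unchanged;
# any difference from the input is exactly the signal leakage the paper
# measures.
nt, nx = 64, 12
t = np.arange(nt)
wavelet = np.exp(-0.5 * ((t - 20) / 3.0) ** 2)
data = np.stack([np.roll(wavelet, n) for n in range(nx)], axis=1)
filtered = fx_decon(data)
```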


Geophysics ◽  
2020 ◽  
Vol 85 (5) ◽  
pp. J85-J98
Author(s):  
Shuang Liu ◽  
Xiangyun Hu ◽  
Dalian Zhang ◽  
Bangshun Wei ◽  
Meixia Geng ◽  
...  

Natural remanent magnetization acts as a record of the previous orientations of the Earth’s magnetic field, and it is an important feature when studying geologic phenomena. The so-called IDQ curve describes the relationship between the inclination (I) and declination (D) of remanent magnetization and the Koenigsberger ratio (Q). Here, we construct the IDQ curve using data on ground and airborne magnetic anomalies. The curve is devised using modified approaches for estimating the total magnetization direction, e.g., identifying the maximal position of minimal reduced-to-the-pole fields or identifying correlations between total and vertical reduced-to-the-pole field gradients. The method is tested using synthetic data, and the results indicate that the IDQ curve can provide valuable information on the remanent magnetization direction based on available data on the Koenigsberger ratio. Then, the method is used to interpret field data from the Yeshan region in eastern China, where ground anomalies have been produced by igneous rocks, including diorite and basalt, which occur along with magnetite and hematite ore bodies. The IDQ curves for 24 subanomalies are constructed, and these curves indicate two main distribution clusters of remanent magnetization directions corresponding to different structural units of magma intrusion and help identify the lithologies of the magnetic sources in areas covered by Quaternary sediments. The estimated remanent magnetization directions for Cenozoic basalt are consistent with measurements made in paleomagnetism studies. The synthetic and field data indicate that the IDQ curve can be used to efficiently estimate the remanent magnetization direction from a magnetic anomaly, which can help with understanding geologic processes in an area.
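One way to read the IDQ relationship is through the basic vector decomposition M_total = M_induced + M_remanent with Q = |M_remanent| / |M_induced|: given an estimated total magnetization direction and the geomagnetic field direction, each value of Q fixes a remanent (I, D). The sketch below implements only this decomposition under that reading; it is not the authors' gradient-based estimation procedure, and the coordinate conventions and root choice are assumptions.

```python
import numpy as np

def dir_to_vec(inc, dec):
    """Unit vector from inclination/declination in degrees
    (x north, y east, z down)."""
    i, d = np.radians(inc), np.radians(dec)
    return np.array([np.cos(i) * np.cos(d), np.cos(i) * np.sin(d), np.sin(i)])

def vec_to_dir(v):
    x, y, z = v / np.linalg.norm(v)
    return np.degrees(np.arcsin(z)), np.degrees(np.arctan2(y, x))

def idq_curve(total_dir, field_dir, qs):
    """Remanent (I, D) as a function of Q, given the estimated TOTAL
    magnetization direction u and the geomagnetic field direction f.
    Normalizing the induced magnitude to 1 makes the remanent magnitude Q,
    and |t*u - f| = Q gives a quadratic in the total magnitude t."""
    u, f = dir_to_vec(*total_dir), dir_to_vec(*field_dir)
    c = float(np.dot(u, f))
    out = []
    for q in qs:
        t = c + np.sqrt(c * c - 1.0 + q * q)   # assumed physical root (large Q)
        out.append(vec_to_dir(t * u - f))
    return out

# Round-trip check: build a total magnetization from known parts, then
# recover the remanent direction from the total direction and Q alone.
field = (60.0, 0.0)
remanent_true = (-30.0, 120.0)
q_true = 2.0
m_total = dir_to_vec(*field) + q_true * dir_to_vec(*remanent_true)
curve = idq_curve(vec_to_dir(m_total), field, [q_true])
```

Sweeping `qs` over a range of Koenigsberger ratios traces out one curve of candidate remanent directions per anomaly, which is the kind of object the 24 subanomaly curves cluster.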


Geophysics ◽  
2020 ◽  
Vol 85 (4) ◽  
pp. KS127-KS138 ◽  
Author(s):  
Yujin Liu ◽  
Yue Ma ◽  
Yi Luo

Locating microseismic source positions using seismic energy emitted from hydraulic fracturing is essential for choosing optimal fracking parameters and maximizing the fracturing effects in hydrocarbon exploitation. Interferometric crosscorrelation migration (ICCM) and zero-lag autocorrelation of time-reversal imaging (ATRI) are two important passive seismic source locating approaches that are proposed independently and seem to be substantially different. We have proven that these two methods are theoretically identical and produce very similar images. Moreover, we have developed cross-coherence that uses normalization by the spectral amplitude of each of the traces, rather than crosscorrelation or deconvolution, to improve the ICCM and ATRI methods. The adopted method enhances the spatial resolution of the source images and is particularly effective in the presence of highly variable and strong additive random noise. Synthetic and field data tests verify the equivalence of the conventional ICCM and ATRI and the equivalence of their improved versions. Compared with crosscorrelation- and deconvolution-based source locating methods, our approach shows a high-resolution property and antinoise capability in numerical tests using synthetic data with single and multiple sources, as well as field data.
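The cross-coherence normalization is simple to state: divide the crosscorrelation spectrum by the amplitude spectrum of each trace, so only phase (traveltime) information survives. A two-trace toy sketch, with a synthetic Gaussian arrival and a 7-sample delay that are both invented for illustration:

```python
import numpy as np

def crosscorrelation(a, b):
    """Plain circular crosscorrelation via the frequency domain."""
    A, B = np.fft.rfft(a), np.fft.rfft(b)
    return np.fft.irfft(A * np.conj(B), n=len(a))

def cross_coherence(a, b, eps=1e-12):
    """Crosscorrelation normalized by the spectral amplitude of EACH
    trace; the whitening is what sharpens the source image."""
    A, B = np.fft.rfft(a), np.fft.rfft(b)
    return np.fft.irfft(A * np.conj(B) / ((np.abs(A) + eps) * (np.abs(B) + eps)),
                        n=len(a))

# Two receivers recording the same band-limited arrival 7 samples apart
n = 256
t = np.arange(n)
trace_a = np.exp(-0.5 * ((t - 50) / 4.0) ** 2)
trace_b = np.roll(trace_a, 7)
cc = crosscorrelation(trace_a, trace_b)
ch = cross_coherence(trace_a, trace_b)
```

Both estimates peak at the correct lag, but the cross-coherence peak is markedly narrower, which is the resolution gain the abstract attributes to normalizing by each trace's spectrum rather than deconvolving by one of them.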

