On: “The Equivalent Source Technique” by C. N. G. Dampney (GEOPHYSICS, February 1969, p. 39–53)

Geophysics, 1970, Vol. 35 (1), pp. 158–159
Author(s): Amalendu Roy

The following comments should not be taken as indicative of my overall appreciation of the paper, which I have read with immense interest: (1) The title of a paper should be a little more expressive where possible; in this case, I would have suggested “Equivalent Source Technique in Gravity Interpretation.” A more specific title helps readers pick out articles of interest from the cover. (2) The equivalent source technique, in a more elementary form, was used for upward continuation of gravity data by Roy and Burman (1960).

Geophysics, 1993, Vol. 58 (8), pp. 1074–1083
Author(s): D. Bhaskara Rao, M. J. Prakash, N. Ramesh Babu

The decrease of density contrast with depth in sedimentary basins can often be approximated by an exponential function. Theoretical Fourier transforms are derived for symmetric trapezoidal, vertical fault, vertical prism, syncline, and anticline models. This is desirable because no equivalent closed-form solutions exist in the space domain for these models combined with an exponential density contrast. These transforms exhibit characteristic minima, maxima, and zero values, and hence graphical methods have been developed for interpretation of model parameters. After applying end corrections to improve the discrete transforms of observed gravity data, the transforms are interpreted for model parameters. This method is first tested on two synthetic models, then applied to gravity anomalies over the San Jacinto graben and Los Angeles basin.
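To illustrate the exponential density-contrast model the abstract refers to, the following sketch (our own, not from the paper) integrates the familiar infinite-slab formula over depth for a contrast Δρ(z) = Δρ₀·exp(−λz), giving g = 2πGΔρ₀(1 − e^(−λt))/λ; the parameter values are invented for the example.

```python
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def slab_gravity_exponential(drho0, lam, t):
    """Bouguer-slab gravity (m/s^2) for a layer of thickness t (m) whose
    density contrast decays exponentially with depth:
    drho(z) = drho0 * exp(-lam * z)."""
    return 2.0 * np.pi * G * drho0 * (1.0 - np.exp(-lam * t)) / lam

# Hypothetical basin: -550 kg/m^3 contrast at the surface, decay constant
# 0.5 per km, 4 km of sediments; 1 mGal = 1e-5 m/s^2.
g = slab_gravity_exponential(-550.0, 0.5e-3, 4000.0)
print(g * 1e5, "mGal")
```

As λ → 0 the expression reduces to the uniform-slab value 2πGΔρ₀t, which is a quick sanity check on any implementation.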


2020
Author(s): Leonardo Uieda, Santiago Soler

We investigate the use of cross-validation (CV) techniques to estimate the accuracy of equivalent-source (also known as equivalent-layer) models for interpolation and processing of potential-field data. Our preliminary results indicate that some common CV algorithms (e.g., random permutations and k-folds) tend to overestimate the accuracy. We have found that blocked CV methods, where the data are split along spatial blocks instead of randomly, provide more conservative and realistic accuracy estimates. Beyond evaluating an equivalent-source model's performance, cross-validation can be used to automatically determine configuration parameters, like source depth and amount of regularization, that maximize prediction accuracy and avoid over-fitting.

Widely used in gravity and magnetic data processing, the equivalent-source technique consists of a linear model (usually point sources) used to predict the observed field at arbitrary locations. Upward continuation, interpolation, gradient calculations, leveling, and reduction to the pole can be performed simultaneously by using the model to make predictions (i.e., forward modelling). Likewise, the use of linear models to make predictions is the backbone of many machine learning (ML) applications. The predictive performance of ML models is usually evaluated through cross-validation, in which the data are split (usually randomly) into a training set and a validation set. Models are fit on the training set and their predictions are evaluated against the validation set using a goodness-of-fit metric, like the mean square error or the R² coefficient of determination. Many cross-validation methods exist in the literature, varying in how the data are split and how this process is repeated. Prior research from the statistical modelling of ecological data suggests that prediction accuracy is usually overestimated by traditional CV methods when the data are spatially auto-correlated. This issue can be mitigated by splitting the data along spatial blocks rather than randomly. We conducted experiments on synthetic gravity data to investigate the use of traditional and blocked CV methods in equivalent-source interpolation. We found that the overestimation problem also occurs and that more conservative accuracy estimates are obtained when applying blocked versions of random permutations and k-fold. Further studies need to be conducted to generalize these findings to upward continuation, reduction to the pole, and derivative calculation.

Open-source software implementations of the equivalent-source and blocked cross-validation (in progress) methods are available in the Python libraries Harmonica and Verde, which are part of the Fatiando a Terra project (www.fatiando.org).
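The core idea of blocked splitting can be sketched in a few lines (our own minimal illustration, not the authors' code or the Verde API): bin the station coordinates into spatial blocks, then hold out whole blocks as validation folds so that validation points are never immediately adjacent to training points.

```python
import numpy as np

def block_kfold_indices(east, north, n_folds=5, block_size=10e3, seed=0):
    """Yield (train, test) index arrays, holding out whole spatial blocks.

    Minimal sketch: each point is labeled by the block it falls in, the
    blocks are shuffled, and each fold holds out a disjoint set of blocks.
    """
    cols = np.floor(east / block_size).astype(int)
    rows = np.floor(north / block_size).astype(int)
    # Map each (row, col) pair to a compact block label 0..n_blocks-1.
    _, block_id = np.unique(rows * 100000 + cols, return_inverse=True)
    rng = np.random.default_rng(seed)
    blocks = rng.permutation(block_id.max() + 1)
    for fold_blocks in np.array_split(blocks, n_folds):
        in_test = np.isin(block_id, fold_blocks)
        yield np.where(~in_test)[0], np.where(in_test)[0]

# Usage with synthetic station coordinates on a 100 x 100 km area:
rng = np.random.default_rng(42)
east = rng.uniform(0, 100e3, 500)
north = rng.uniform(0, 100e3, 500)
for train, test in block_kfold_indices(east, north):
    assert train.size + test.size == 500
```

Each point lands in exactly one validation fold, just as in ordinary k-fold, but the train/test boundary follows block edges instead of cutting between neighboring stations.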


Geophysics, 2007, Vol. 72 (2), pp. I13–I22
Author(s): Fernando J. Silva Dias, Valeria C. Barbosa, João B. Silva

We present a new semiautomatic gravity interpretation method for estimating a complex interface between two media containing density heterogeneities (referred to as interfering sources) that give rise to a complex, interfering gravity field. The method combines a robust fitting procedure with the constraint that the interface is very smooth near the interfering sources, whose approximate horizontal coordinates are defined by the user. The proposed method differs from regional-residual separation techniques in that it makes no assumption about the spectral content of the anomaly produced by the interface to be estimated; the interface can produce a gravity response containing both low- and high-wavenumber features. As a result, it may be applied to map the relief of a complex interface in a geologic setting containing either shallow or deep-seated interfering sources. Tests conducted with synthetic data show that the method can be useful in estimating the basement relief of a sedimentary basin in the presence of salt layers and domes, or in the presence of mafic intrusions in the basement or in both the basement and the sedimentary section. The method was applied to real gravity data from two geologic settings having different kinds of interfering sources and interfaces to be interpreted: (1) the interface between the upper and lower crusts over the Bavali shear zone of southern India and (2) the anorthosite-tonalite interface over the East Bull Lake gabbro-anorthosite complex outcrop in Ontario, Canada.
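The abstract mentions a robust fitting procedure without specifying it. One common robust scheme (our illustrative stand-in, not necessarily the authors' choice) is iteratively reweighted least squares (IRLS) with Huber weights, which down-weights large residuals such as those produced by interfering sources:

```python
import numpy as np

def irls_huber(A, d, n_iter=20, c=1.5):
    """Robustly solve A @ p ~ d via IRLS with Huber weights.

    c is the Huber threshold in units of the robust residual scale
    (1.4826 * median absolute deviation). Illustrative sketch only.
    """
    p = np.linalg.lstsq(A, d, rcond=None)[0]          # ordinary LS start
    for _ in range(n_iter):
        r = d - A @ p
        scale = 1.4826 * np.median(np.abs(r - np.median(r))) + 1e-12
        u = np.abs(r) / (c * scale)
        w = np.where(u <= 1.0, 1.0, 1.0 / u)          # Huber weights
        Aw = A * w[:, None]
        # Weighted normal equations: (A^T W A) p = A^T W d
        p = np.linalg.lstsq(Aw.T @ A, Aw.T @ d, rcond=None)[0]
    return p

# Toy usage: straight-line fit with two gross outliers.
x = np.linspace(0.0, 10.0, 50)
d = 2.0 * x + 1.0
d[[5, 30]] += 50.0                                    # contaminate
A = np.column_stack([x, np.ones_like(x)])
slope, intercept = irls_huber(A, d)
```

In a gravity context the rows of `A` would hold the sensitivities of each station to the interface parameters; the outliers play the role of stations dominated by interfering sources.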


Geophysics, 1963, Vol. 28 (3), pp. 369–378
Author(s): Sigmund Hammer

“Old gravity data never die.” Gravity surveys can be reinterpreted whenever warranted by new geological concepts, development information, or improved techniques. Important additional uses can be gained by extending the interpretation to deeper horizons: the gravity effects of overlying strata, whose structure becomes known in detail from shallower development, are calculated and subtracted. This constitutes the new technique reported in this paper. An essential new factor which makes gravity “stripping” practicable is the advent of the gamma-gamma density log, which determines subsurface density relations in strata penetrated by development drilling. Combined stratigraphic and density information defines the mass anomalies in the upper strata. Subtracting the calculated shallower gravity influences improves the definition of the deeper gravity prospects. Applications of the method are illustrated by selected examples.
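A toy sketch of the stripping idea (our illustration, not the paper's algorithm): once the density and thickness of a shallow layer are known from drilling and density logs, compute its gravity effect and subtract it from the observed anomaly. Here the known layer is approximated, station by station, as an infinite slab, a common first-order approximation; all numbers are invented.

```python
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def slab_effect_mgal(density_contrast, thickness):
    """Bouguer-slab gravity effect (mGal) of a layer of given thickness (m)
    and density contrast (kg/m^3) beneath each station."""
    return 2.0 * np.pi * G * density_contrast * thickness * 1e5

# Observed anomaly = deep target + effect of a known shallow lens.
observed = np.array([1.2, 1.8, 3.1, 2.0, 1.3])                # mGal
known_thickness = np.array([0.0, 200.0, 600.0, 250.0, 0.0])   # m, from logs
stripped = observed - slab_effect_mgal(50.0, known_thickness)
```

The stripped profile is what a deeper interpretation would then be run on; in practice the shallow layer's effect would be computed with a full 3-D forward model rather than a slab per station.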


Geophysics, 1985, Vol. 50 (12), pp. 2709–2719
Author(s): Corine Prieto, Carolyn Perkins, Ernest Berkman

An interpretation is presented of a 219 km regional profile which traverses the eastern Columbia River Plateau in Washington State. Aeromagnetic, magnetotelluric (MT), and gravity data were first interpreted separately. All three data sets were then satisfied by a single geologic model. The objective of this case study is to illustrate the individual contributions of these three geophysical data sets to a final integrated interpretation. The aeromagnetic interpretation has produced regional structural information and data from which rock compositions can be inferred. The MT interpretation shows that the basalt/sediment interface can be determined, and thus a relative sediment thickness can be inferred. The gravity interpretation depends upon an additional method to determine either the basalt or basement horizon: for gravity to approximate depth to basement or sedimentary thickness, the base of the basalt must be determined independently. From comparison of the regional structural results of the three geophysical techniques we conclude that aeromagnetic or MT data can be used to determine major structural trends. Reasonable rock compositions are also determined from the combined data sets. The interpreter must be aware of the different rock properties measured by each tool when performing an integrated interpretation; comparisons between the various techniques must be based upon similar assumptions. We recommend that detailed, integrated models be included for a thorough evaluation of any basalt-covered area. The analysis of rock composition and regional structural information thus derived provides a sound basis for a regional tectonic interpretation and subsequent prospect evaluation.


Geophysics, 1965, Vol. 30 (3), pp. 424–438
Author(s): Mark E. Odegard, Joseph W. Berg

The gravitational anomalies of simple bodies (sphere, cylinder, and fault) were used to develop methods for analyzing gravity data in the frequency domain. The Fourier transforms of the functional representations of the theoretical gravitational anomalies of these bodies were obtained. Mathematical relations were formulated between the transform‐versus‐frequency relationships and the depths and sizes of the bodies. Compound gravity anomalies (multiple cylinders, fault, and cylinder) were analyzed, and the transforms were reduced to transforms of anomalies due to individual simple bodies. These methods of analysis were applied to theoretical anomalies using numerical techniques, and the accuracy of both depth and size determinations was within a few percent in all cases.
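The frequency-domain depth rule for one of these simple bodies can be checked numerically. For an infinite horizontal cylinder at depth z, the anomaly g(x) = 2Gm·z/(x² + z²) (m = mass per unit length) has an amplitude spectrum proportional to exp(−|k|z), so the slope of ln|F(k)| versus k equals −z. This sketch is our own verification, not the paper's code, and the profile parameters are invented:

```python
import numpy as np

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
z_true = 500.0    # depth to cylinder axis, m
m = 1.0e7         # mass per unit length, kg/m (arbitrary)

# Synthetic profile across the cylinder, 50 m station spacing.
dx = 50.0
x = np.arange(-50_000.0, 50_000.0, dx)
g = 2.0 * G * m * z_true / (x**2 + z_true**2)

# Amplitude spectrum and angular wavenumber (rad/m).
F = np.abs(np.fft.rfft(g))
k = 2.0 * np.pi * np.fft.rfftfreq(x.size, d=dx)

# Fit ln|F| vs k over a band of well-resolved wavenumbers; slope = -depth.
band = slice(5, 60)
slope = np.polyfit(k[band], np.log(F[band]), 1)[0]
z_est = -slope
```

The overall scale of the discrete transform drops out of the slope, which is why the method recovers depth without knowing the mass per unit length.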

