Interactive gravity inversion

Geophysics ◽  
2006 ◽  
Vol 71 (1) ◽  
pp. J1-J9 ◽  
Author(s):  
João B. C. Silva ◽  
Valéria C. F. Barbosa

We have developed a new approach for estimating the location and geometry of several density anomalies that give rise to a complex, interfering gravity field. The user interactively defines the assumed outline of the true gravity sources in terms of points and line segments, and the method estimates sources closest to the specified outline to achieve a match between the predicted and observed gravity fields. Each gravity source is assumed to be a homogeneous body with a known density contrast; different density contrasts may be assigned to each source. Tests with synthetic data show that the method can be of use in estimating (1) multiple laterally adjacent and closely situated gravity sources, (2) single gravity sources consisting of several homogeneous compartments with different density contrasts, and (3) two gravity sources with different density contrasts of the same sign, one totally enclosed by the other. The method is also applied to three different sets of field data where the gravity sources belong to the same categories established in the tests with synthetic data. The method produces solutions consistent with the known geologic attributes of the gravity sources, illustrating its potential practicality.

Geophysics ◽  
1971 ◽  
Vol 36 (3) ◽  
pp. 605-608 ◽  
Author(s):  
Edwin S. Robinson

Investigation of geological structure by gravimetric and magnetic field surveys requires consideration of relationships between gravity-anomaly and magnetic-anomaly generating sources. The possibility of using Poisson's relation to examine magnetic and gravity fields related to a common source is intriguing. This relation may be expressed as A(x, y, z) = −(I/(Gρ)) ∂U(x, y, z)/∂α (1), where A(x, y, z) is the magnetic field potential and U(x, y, z) is the gravity field potential at a point in space due to a source of uniform density ρ and uniform magnetization I in the direction α, G is the gravitational constant, and the derivative is taken along α at the observation point. This expression has been used to derive magnetic anomalies over idealized forms (Nettleton, 1940) and, by Baranov (1957), to extract pseudogravity fields from magnetic field data. The purpose of this paper is to develop an expression for extracting a pseudomagnetic field from gravity field data and to examine the practical applications of this expression.
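Sign conventions for Poisson's relation vary between texts, but the relation is easy to check numerically for a uniformly magnetized homogeneous sphere, whose exterior magnetic potential is exactly that of a dipole. The sketch below (a minimal illustration, not code from the paper; all parameter values are assumed, and the μ0/4π factor is omitted) compares the pseudomagnetic potential obtained from the gravity potential via a finite-difference directional derivative with the exact dipole potential.

```python
import math

G = 6.674e-11      # gravitational constant
rho = 2700.0       # uniform density, kg/m^3 (assumed value)
I = 2.0            # uniform magnetization along +z (assumed value)
a = 100.0          # sphere radius, m
V = 4.0 / 3.0 * math.pi * a**3   # sphere volume

def U(x, y, z):
    """Gravity potential of a homogeneous sphere, valid outside the body."""
    r = math.sqrt(x * x + y * y + z * z)
    return G * rho * V / r

def pseudomagnetic(x, y, z, h=1e-3):
    """Poisson's relation with alpha = +z: A = -(I/(G rho)) * dU/dz,
    the derivative approximated by central differences at the observation point."""
    dUdz = (U(x, y, z + h) - U(x, y, z - h)) / (2.0 * h)
    return -(I / (G * rho)) * dUdz

def dipole(x, y, z):
    """Exact magnetic potential of a uniformly z-magnetized sphere:
    (moment) * cos(theta) / r^2."""
    r = math.sqrt(x * x + y * y + z * z)
    return I * V * z / r**3

p = (500.0, 300.0, 800.0)   # observation point outside the sphere
print(pseudomagnetic(*p), dipole(*p))   # the two values agree closely
```

The agreement to many significant figures is the point: the magnetic potential of the common source is a scaled directional derivative of the gravity potential, which is what makes pseudomagnetic (and pseudogravity) transformations possible.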


Geosciences ◽  
2021 ◽  
Vol 11 (11) ◽  
pp. 467
Author(s):  
Daniele Sampietro ◽  
Martina Capponi

The exploitation of gravity fields to retrieve information about subsurface geological structures is sometimes considered a second-rank method, ranked behind other geophysical methods such as seismic surveying, which can provide a high-resolution, detailed picture of the main geological horizons. In the current work we show, through a realistic synthetic case study, that the gravity field, thanks to the availability of free-of-charge high-resolution global models and to improvements in gravity inversion methods, can represent a valid and cheap tool to complete and enhance geophysical modelling of the Earth's crust. Three tests were carried out: in the first, a simple two-layer problem was considered, while in tests two and three we considered two more realistic scenarios in which the availability of constraints derived from 3D or 2D seismic surveys over the study area was simulated. In all the considered test cases, which aim to simulate real-life scenarios, the gravity field, inverted by means of an advanced Bayesian technique, yielded a final solution closer to the (simulated) real model than the assumed a priori information, typically halving the uncertainties in the geometries of the main geological horizons with respect to the initial model.
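The abstract does not spell out the Bayesian technique, but the uncertainty-halving behaviour it reports can be illustrated with a deliberately minimal Gaussian update for a linear forward model (a Bouguer slab, g = 2πGΔρ·h). Everything below is an assumed toy setup for illustration, not values or machinery from the paper.

```python
import math

G = 6.674e-11
drho = 400.0                       # density contrast, kg/m^3 (assumed)
k = 2.0 * math.pi * G * drho       # linear forward model: g = k * h

# Prior on layer thickness h, e.g. from a coarse seismic interpretation
h_prior, sig_prior = 2000.0, 300.0          # metres

# One gravity observation (synthetic truth h = 2150 m) with its noise level
h_true = 2150.0
g_obs, sig_obs = k * h_true, k * 200.0      # noise written via k: 200 m equivalent

# Conjugate Gaussian update for g = k * h:
# posterior precision = prior precision + k^2 / sig_obs^2
post_var = 1.0 / (1.0 / sig_prior**2 + k**2 / sig_obs**2)
post_mean = post_var * (h_prior / sig_prior**2 + k * g_obs / sig_obs**2)
post_sig = math.sqrt(post_var)

print(post_mean, post_sig)   # posterior lies between prior and data,
                             # with roughly half the prior uncertainty
```

Even this one-parameter caricature shows the mechanism: the posterior combines the a priori model and the gravity observation weighted by their precisions, so adding the gravity data tightens the depth estimate, here from a 300 m to a roughly 170 m standard deviation.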


1989 ◽  
Vol 20 (2) ◽  
pp. 271
Author(s):  
G.A.D. Paterson

Initial hydrocarbon discoveries normally lead to a succession of wells on the same or similar seismic trends, sometimes with a succession of dry holes. The problem posed is: given some initial success, to what extent can seismic data be used to predict lithology and prevent these occurrences? Satellite technologists already use similar methods to quickly identify terrain features using an image 'bench mark' and multiple signals. The bench mark for the seismic interpreter is the well, and the multiple signals are the seismic attributes. The tool used to bring these together is the interpretation workstation. A demonstration of the technique on synthetic data shows good results, dependent on several factors. Future work will be directed at evaluating the method on field data and at combining it with other lithology prediction methods.


2021 ◽  
Author(s):  
Bart Root ◽  
Josef Sebera ◽  
Wolfgang Szwillus ◽  
Cedric Thieulot ◽  
Zdenek Martinec ◽  
...  

Abstract. Several alternative gravity forward modelling methodologies and associated numerical codes, each with its own advantages and limitations, are available to the Solid Earth community. With the upcoming state-of-the-art lithosphere density models and accurate global gravity field data sets, it is vital to understand the opportunities and limitations of the various approaches. In this paper, we discuss four widely used techniques: global spherical harmonics (GSH), tesseroid integration (TESS), triangle integration (TRI), and hexahedral integration (HEX). A constant-density shell benchmark shows that all four codes can produce similarly precise gravitational potential fields. Two additional shell tests were conducted with more complicated density structures: laterally varying density structures and a Moho density interface between crust and mantle. The differences between the four codes were all below 1.5 percent of the modelled gravity signal, suitable for reproducing satellite-acquired gravity data. TESS and GSH produced the most similar potential fields (< 0.3 percent). To examine the usability of the forward modelling codes for realistic geological structures, we use the global lithosphere model WINTERC-G, which was constrained, among other data, by satellite gravity field data computed using a spectral forward modelling approach. This spectral code was benchmarked against the GSH code, and it was confirmed that both approaches produce similar gravity solutions with negligible differences between them. In the comparison of the different WINTERC-G-based gravity solutions, GSH and TESS again performed best. Only short-wavelength noise is present between the spectral and tesseroid forward modelling approaches, likely related to the different way in which the spherical harmonic analysis of the varying boundaries of the mass layers is performed.
The spherical harmonic basis functions produce small differences compared to the tesseroid elements, especially at sharp interfaces, which introduces mostly short-wavelength differences. Nevertheless, both approaches (GSH and TESS) result in accurate solutions of the potential field with reasonable computational resources. Differences below 0.5 percent are obtained, resulting in residuals with a standard deviation of 0.076 mGal at 250 km height. The biggest issue for TRI is a characteristic pattern in the residuals that is related to the grid layout. Increasing the resolution and filtering removes most of this erroneous pattern, but at the expense of a higher computational load with respect to the other codes. The other spatial forward modelling scheme, HEX, has more difficulty reproducing gravity field solutions similar to those of GSH and TESS; it needs to go to higher resolutions, resulting in enormous computational effort. The hexahedron-based code performs less than optimally in the forward modelling of the gravity signature, especially for a laterally varying density interface. Care must be taken with any forward modelling software, as the approximation of the geometry of the WINTERC-G model may deteriorate the gravity field solution.
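The constant-density shell benchmark mentioned above works because, outside the shell, the exact potential is simply GM/r, so any spatial integration scheme can be checked against a closed form. A minimal sketch of such a check (shell geometry and density are assumed values, and the "numerical" side is a crude colatitude-ring integration, not any of the four benchmarked codes):

```python
import math

G = 6.674e-11
R_bot, R_top = 6271e3, 6371e3   # 100 km thick shell (assumed geometry)
rho = 3300.0                     # constant shell density, kg/m^3 (assumed)
r_obs = 6621e3                   # observation radius, ~250 km altitude

# Exact result: outside, a homogeneous shell acts as a point mass.
M = 4.0 / 3.0 * math.pi * (R_top**3 - R_bot**3) * rho
V_exact = G * M / r_obs

def thin_shell_numeric(a, m_shell, r, n_theta=4000):
    """Potential of a thin shell of radius a and mass m_shell at distance r > a,
    integrated over colatitude rings with the midpoint rule."""
    V = 0.0
    dtheta = math.pi / n_theta
    for i in range(n_theta):
        theta = (i + 0.5) * dtheta
        dm = 0.5 * m_shell * math.sin(theta) * dtheta   # mass of one ring
        d = math.sqrt(r * r + a * a - 2.0 * a * r * math.cos(theta))
        V += G * dm / d
    return V

# Split the finite shell into thin sub-shells carrying their exact masses.
n_r = 8
V_num = 0.0
for j in range(n_r):
    a0 = R_bot + j * (R_top - R_bot) / n_r
    a1 = R_bot + (j + 1) * (R_top - R_bot) / n_r
    m_j = 4.0 / 3.0 * math.pi * (a1**3 - a0**3) * rho
    V_num += thin_shell_numeric(0.5 * (a0 + a1), m_j, r_obs)

print(V_exact, V_num)   # agree to a small fraction of a percent
```

Real benchmark codes discretize in all three dimensions, but the principle is the same: the analytic shell solution exposes the discretization error of each scheme before any realistic density model is attempted.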


Geophysics ◽  
1996 ◽  
Vol 61 (6) ◽  
pp. 1702-1714 ◽  
Author(s):  
Jorge W. D. Leão ◽  
Paulo T. L. Menezes ◽  
Jacira F. Beltrão ◽  
João B. C. Silva

We present an interpretation method for the gravity anomaly of an arbitrary interface separating two homogeneous media. It consists essentially of a downward continuation of the observed anomaly and the division of the continued anomaly by a scale factor involving the density contrast between the media. Knowledge of the interface depth at isolated points is used to estimate the depth [Formula: see text] of the shallowest point of the interface, the density contrast Δρ between the two media, and the coefficients [Formula: see text] and [Formula: see text] of a first-order polynomial representing a linear trend to be removed from the data. The solutions are stabilized by introducing a damping parameter in the computation of the downward-continued anomaly by the equivalent-layer method. Unlike other interface-mapping methods using gravity data, the proposed method (1) takes into account the presence of an undesirable linear trend in the data; (2) requires just intervals for both Δρ (rather than knowledge of its true value) and the coefficients [Formula: see text] and [Formula: see text]; and (3) does not require knowledge of the average interface depth [Formula: see text]. As a result of (3), the proposed method does not call for extensive knowledge of the interface depth to obtain a statistically significant estimate of [Formula: see text]; rather, it is able to use the knowledge of the interface depth at just a few isolated points to estimate [Formula: see text], Δρ, [Formula: see text], and [Formula: see text]. Tests using synthetic data confirm that the method produces good, stable estimates as long as the established premises (a smooth interface separating two homogeneous media and, at most, an unremoved linear trend in the data) are not violated. If the density contrast is not uniform, the method may still be applied using Litinsky’s concept of effective density.
The method was applied to gravity data from Recôncavo Basin, Brazil, producing good correlations of estimated lows and terraces in the basement with corresponding known geological features.
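The trend-removal and scale-factor steps can be illustrated in a stripped-down form: if, after downward continuation, each gravity value is approximately 2πGΔρ times the local interface depth plus a linear trend, then two isolated known depths suffice to pin the trend coefficients, and division by the scale factor then maps the interface. The sketch below is a toy Bouguer-slab version under assumed values, not the equivalent-layer machinery of the paper.

```python
import math

G = 6.674e-11
drho = -300.0                         # sediment-basement contrast, kg/m^3 (assumed)
scale = 2.0 * math.pi * G * drho      # Bouguer slab factor: g ~ scale * depth

x = [0.0, 1000.0, 2000.0, 3000.0, 4000.0]           # profile positions, m
depth_true = [500.0, 800.0, 1200.0, 900.0, 400.0]   # synthetic interface, m

# Synthetic "observed" anomaly contaminated by a linear trend a0 + a1*x
a0_true, a1_true = 5.0e-5, 1.0e-8
g_obs = [scale * t + a0_true + a1_true * xi for t, xi in zip(depth_true, x)]

# Known interface depths at two isolated points (e.g. boreholes) pin the trend:
i, j = 0, 4
ri = g_obs[i] - scale * depth_true[i]
rj = g_obs[j] - scale * depth_true[j]
a1 = (rj - ri) / (x[j] - x[i])
a0 = ri - a1 * x[i]

# Remove the trend, then divide by the scale factor to map the interface
depth_est = [(g - a0 - a1 * xi) / scale for g, xi in zip(g_obs, x)]
print(depth_est)   # matches depth_true up to rounding
```

In the noise-free toy case the recovery is exact; the paper's damping parameter exists precisely because, with real data, the downward continuation that precedes this division is unstable.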


Geophysics ◽  
2010 ◽  
Vol 75 (3) ◽  
pp. I21-I28 ◽  
Author(s):  
Cristiano M. Martins ◽  
Valeria C. Barbosa ◽  
João B. Silva

We have developed a gravity-inversion method for simultaneously estimating the 3D basement relief of a sedimentary basin and the parameters defining a presumed parabolic decay of the density contrast with depth in a sedimentary pack, assuming prior knowledge about the basement depth at a few points. The sedimentary pack is approximated by a grid of 3D vertical prisms juxtaposed in both horizontal directions of a right-handed coordinate system. The prisms’ thicknesses represent the depths to the basement and are the parameters to be estimated from the gravity data. To estimate the parameters defining the parabolic decay of the density contrast with depth and to produce stable depth-to-basement estimates, we imposed smoothness on the basement depths and proximity between estimated and known depths at boreholes. We applied our method to synthetic data from a simulated complex 3D basement relief with two sedimentary sections having distinct parabolic laws describing the density-contrast variation with depth. The results provide good estimates of the true parameters of the parabolic law of density-contrast decay with depth and of the basement relief. Inverting the gravity data from the onshore and part of the shallow offshore Almada Basin on Brazil’s northeastern coast shows good correlation with known structural features.
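The role of a depth-dependent contrast can be seen in a one-column caricature of the problem: with the slab approximation and an illustrative parabolic decay law Δρ(z) = Δρ0(1 − z/zc)², chosen here for simplicity (the paper's actual parabolic law may differ), the gravity of a column has a closed form that can be inverted for thickness. All parameter values are assumed.

```python
import math

G = 6.674e-11
drho0 = -400.0     # surface density contrast, kg/m^3 (assumed)
zc = 10000.0       # depth at which the contrast decays to zero (assumed)

def g_column(h):
    """Slab gravity of a column of thickness h with contrast
    drho(z) = drho0 * (1 - z/zc)**2, integrated in closed form."""
    return 2.0 * math.pi * G * drho0 * (zc / 3.0) * (1.0 - (1.0 - h / zc)**3)

def depth_from_g(g):
    """Invert the closed form for the column thickness h."""
    u = 1.0 - 3.0 * g / (2.0 * math.pi * G * drho0 * zc)
    return zc * (1.0 - u ** (1.0 / 3.0))

h_true = 3000.0
h_rec = depth_from_g(g_column(h_true))
print(h_rec)   # recovers ~3000 m
```

The real inversion estimates a full grid of prism thicknesses plus the decay-law parameters simultaneously, with smoothness and borehole constraints; the round trip above only shows why the density law and the relief are coupled through the data.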


Author(s):  
V. Mizuhira ◽  
Y. Futaesaku

Previously we reported that tannic acid is a very effective fixative for proteins, including polypeptides. In particular, in cross sections of microtubules, thirteen subunits in the A-tubule and eleven in the B-tubule could be observed very clearly, and an elastic fiber could be demonstrated very clearly as an electron-opaque, homogeneous fiber. However, tannic acid did not penetrate into the deep portion of the tissue block, so we tried catechin, which shows almost the same chemical behavior toward proteins as tannic acid. Moreover, we thought that catechin should have two reactive sites: one a phenol and the other a catechol. The catechol site should react with osmium to form osmium black, while the phenol site should react with peroxidase in the presence of perhydroxide.


Author(s):  
Yusroh Yusroh ◽  
Mohd. Zaki Abd. Rahman

Muḥammad Saʻīd Al-‘Ashmāwī and Muḥammad Shaḥrūr are well known as contemporary Muslim thinkers. This article tries to map their contemporary ideas on Islamic jurisprudence. The main data of this research are taken from the works of Al-‘Ashmāwī and Shaḥrūr themselves. In particular, the paper analyzes Al-‘Ashmāwī's ideas on sharia, politics, hijab, marriage, and divorce; it also critically examines Shaḥrūr's ideas on the Qur'an, Sunnah and fiqh, the theory of limits (ḥudūd), pluralism, the commandments, inheritance, hijab, marriage, divorce, dowry, politics, and the imamate. After analyzing their lives and their ideas on Islamic jurisprudence, the paper finds that their social, educational, and practical backgrounds have shaped their intellectual formation and ideas. Al-‘Ashmāwī is driven by ijtihad and enlightenment and is regarded as an enlightened thinker, whereas Shaḥrūr takes a new approach, creating the theory of limits (ḥudūd) as a new way. In keeping with their intellectual backgrounds, Al-‘Ashmāwī has a good command of Arabic, English, and French, as well as of religion, sharia, jurisprudence, and theology, while Shaḥrūr has a good command of Arabic, English, and Russian, as well as of philosophy, philology, and historical linguistics.


Author(s):  
Anna Varnayeva

Coordinative constructions are traditionally opposed to subordinative constructions. However, this opposition comes down to a denial of dependence in coordinative constructions, and so the parity of the two constructions does not come to light: a subordinative construction can be described without a coordinative one. This situation is not improved by the detection of a coordinative triangle in all coordinative constructions. The article presents a new approach to the study of coordinative constructions: a coordinative construction is a system with not only specific relations (the coordinative triangle) but also specific elements. The novelty of the study consists in the appeal to extralinguistic facts, namely the mathematical concept of a set and its elements. There are many similarities between the two. A set in mathematics corresponds to the generalizing element, and its elements to the composed row of a coordinative construction; in the first case the set is not partitioned, in the second case it is. The equivalent components of a coordinative construction correspond to the elements of a set, the characteristic property of a set in mathematics corresponds to homogeneity in coordinative constructions, and so on. It is demonstrated for the first time that coordinative and subordinative constructions are correlative, and the study of one construction is impossible without the study of the other. Their parity is shown as follows: coordinative constructions combine elements of one set, while subordinative constructions combine elements of different sets. Cf.: roses and tulips vs. red roses. The coordinative construction names elements of one set, "flowers"; the subordinative construction combines elements of different sets, "flowers" and "colors". It should be noted that the mathematical concept of a set relates to the so-called logical aspect in linguistics, that is, thinking about reality.


Author(s):  
M. John Plodinec

Abstract Over the last decade, communities have become increasingly aware of the risks they face. They are threatened by natural disasters, which may be exacerbated by climate change and the movement of land masses. Growing globalization has made a pandemic due to the rapid spread of highly infectious diseases ever more likely. Societal discord breeds its own threats, not the least of which is the spread of radical ideologies giving rise to terrorism. The accelerating rate of technological change has bred its own social and economic risks. This widening spectrum of risk poses a difficult question to every community: how resilient will it be to the extreme events it faces? In this paper, we present a new approach to answering that question. It is based on the stress testing of financial institutions required by regulators in the United States and elsewhere. It generalizes stress testing by expanding the concept of “capital” beyond finance to include the other “capitals” (e.g., human, social) possessed by a community. Through use of this approach, communities can determine which investments of their capitals are most likely to improve their resilience. We provide an example of using the approach and discuss its potential benefits.

