Magnetic interpretation in three dimensions using Euler deconvolution

Geophysics ◽  
1990 ◽  
Vol 55 (1) ◽  
pp. 80-91 ◽  
Author(s):  
A. B. Reid ◽  
J. M. Allsop ◽  
H. Granser ◽  
A. J. Millett ◽  
I. W. Somerton

Magnetic‐survey data in grid form may be interpreted rapidly for source positions and depths by deconvolution using Euler’s homogeneity relation. The method employs gradients, either measured or calculated. Data need not be pole‐reduced, so that remanence is not an interfering factor. Geologic constraints are imposed by use of a structural index. Model studies show that the method can locate or outline confined sources, vertical pipes, dikes, and contacts with remarkable accuracy. A field example using data from an intensively studied area of onshore Britain shows that the method works well on real data from structurally complex areas and provides a series of depth‐labeled Euler trends which mark magnetic edges, notably faults, with good precision.
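The core computation can be sketched as a linear least-squares problem: Euler's homogeneity relation, (x − x0)∂T/∂x + (y − y0)∂T/∂y + (z − z0)∂T/∂z = N(B − T), is rearranged so that the source position (x0, y0, z0) and background level B are the unknowns. A minimal Python sketch on a synthetic point source with analytic gradients (source position, amplitude, and grid are hypothetical illustration values, not the paper's data):

```python
import numpy as np

def euler_deconvolution(x, y, z, T, Tx, Ty, Tz, N):
    """Least-squares solution of Euler's homogeneity relation
    (x - x0)*Tx + (y - y0)*Ty + (z - z0)*Tz = N*(B - T),
    rearranged as x0*Tx + y0*Ty + z0*Tz + N*B = x*Tx + y*Ty + z*Tz + N*T."""
    A = np.column_stack([Tx, Ty, Tz, N * np.ones_like(T)])
    b = x * Tx + y * Ty + z * Tz + N * T
    return np.linalg.lstsq(A, b, rcond=None)[0]  # x0, y0, z0, B

# Synthetic point source at (10, 20, 5) with structural index N = 2
xs, ys, zs, N, M = 10.0, 20.0, 5.0, 2.0, 1000.0
gx, gy = np.meshgrid(np.linspace(0.0, 40.0, 21), np.linspace(0.0, 40.0, 21))
x, y = gx.ravel(), gy.ravel()
z = np.zeros_like(x)                           # observation surface z = 0
r2 = (x - xs) ** 2 + (y - ys) ** 2 + (z - zs) ** 2
T = M / r2 ** (N / 2)                          # homogeneous field of degree -N
Tx = -N * M * (x - xs) / r2 ** (N / 2 + 1)     # analytic gradients
Ty = -N * M * (y - ys) / r2 ** (N / 2 + 1)
Tz = -N * M * (z - zs) / r2 ** (N / 2 + 1)
x0, y0, z0, B = euler_deconvolution(x, y, z, T, Tx, Ty, Tz, N)
```

In practice the relation is solved in a moving window over the grid, with the structural index chosen from the geologic constraint the abstract describes; here one window covering the whole grid recovers the planted source exactly.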

2014 ◽  
Vol 2 (4) ◽  
pp. SJ1-SJ8 ◽  
Author(s):  
Ahmed Salem ◽  
Richard Blakely ◽  
Chris Green ◽  
Derek Fairhead ◽  
Dhananjay Ravat

The local-wavenumber method estimates the depth to a magnetic source based on the spectral content of a single anomaly assuming that the base of the magnetic body is at infinite depth. However, the “infinite-depth” assumption can lead to significant underestimation of the depth to the top of magnetic bodies, especially in areas where the depth to the bottom of the magnetic layer is not large compared to the depth to the top, as would occur in high heat-flow regions and thinned continental margins. Such underestimation of depths has been demonstrated in model studies and using real data with seismic and well control. We evaluated a modification to the local-wavenumber approach to estimate the depth to the top of magnetic sources assuming that the depth to the bottom of the magnetic sources is controlled by the Curie temperature or crustal thickness. We applied this new method to a simple model of a continental margin and to magnetic survey data over the central Red Sea where the Curie isotherm is shallow. The effective structural index of this finite depth extent model is found to increase continuously from the continent to the ocean as the depth to the magnetic basement increases and the depth to the bottom of the magnetic layer decreases. We have also discovered in this study that the local-wavenumber maxima correlate well with major seafloor spreading magnetic reversal epochs in the central Red Sea segment.
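The local-wavenumber idea itself can be illustrated on a profile over a simple 2D line source, where the local wavenumber is the spatial rate of change of the local phase of the analytic signal and, under the infinite-depth assumption, the peak value relates to depth as depth = (N + 1)/κ_max (N = 1 is assumed for this source type; the depth and profile values below are hypothetical):

```python
import numpy as np

h = 10.0                                   # synthetic source depth
x = np.arange(-50.0, 50.0, 0.1)
r2 = x ** 2 + h ** 2
Tx = -2.0 * x * h / r2 ** 2                # horizontal derivative of T = h / (x^2 + h^2)
Tz = (x ** 2 - h ** 2) / r2 ** 2           # vertical derivative (its Hilbert pair)

# Local wavenumber: derivative of the local phase along the profile
phase = np.unwrap(np.arctan2(Tz, Tx))
kappa = np.abs(np.gradient(phase, x))

# Infinite-depth assumption: depth = (N + 1) / kappa_max with N = 1
depth_estimate = 2.0 / kappa.max()
```

For this infinite-depth-extent source the estimate recovers h; the abstract's point is that when the magnetic layer bottom is shallow, the same formula underestimates the depth to the top, motivating the finite-depth correction.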


2021 ◽  
Vol 22 (1) ◽  
Author(s):  
João Lobo ◽  
Rui Henriques ◽  
Sara C. Madeira

Abstract Background Three-way data started to gain popularity due to their increasing capacity to describe inherently multivariate and temporal events, such as biological responses, social interactions along time, urban dynamics, or complex geophysical phenomena. Triclustering, the subspace clustering of three-way data, enables the discovery of patterns corresponding to data subspaces (triclusters) with values correlated across the three dimensions (observations × features × contexts). With an increasing number of algorithms being proposed, effectively comparing them with the state of the art is paramount. These comparisons are usually performed using real data without a known ground truth, thus limiting the assessments. In this context, we propose a synthetic data generator, G-Tric, allowing the creation of synthetic datasets with configurable properties and the possibility to plant triclusters. The generator is prepared to create datasets resembling real three-way data from biomedical and social data domains, with the additional advantage of providing the ground truth (the triclustering solution) as output. Results G-Tric can replicate real-world datasets and create new ones that match researchers’ needs across several properties, including data type (numeric or symbolic), dimensions, and background distribution. Users can tune the patterns and structure that characterize the planted triclusters (subspaces) and how they interact (overlapping). Data quality can also be controlled by defining the amount of missing values, noise, or errors. Furthermore, a benchmark of datasets resembling real data is made available, together with the corresponding triclustering solutions (planted triclusters) and generating parameters. Conclusions Triclustering evaluation using G-Tric makes it possible to combine both intrinsic and extrinsic metrics, yielding more reliable comparisons of solutions.
A set of predefined datasets, mimicking widely used three-way data and exploring crucial properties, was generated and made available, highlighting G-Tric’s potential to advance the triclustering state of the art by easing the evaluation of new triclustering approaches.
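The central mechanism, planting a ground-truth tricluster in a noisy three-way array, can be sketched as follows. This is a minimal illustration of the idea, not G-Tric's actual interface; the function name, shapes, and parameters are hypothetical:

```python
import numpy as np

def plant_tricluster(shape, tric_shape, value, noise=0.1, seed=0):
    """Generate a noisy 3-way array (observations x features x contexts)
    and plant one constant-pattern tricluster; return data + ground truth."""
    rng = np.random.default_rng(seed)
    data = rng.normal(0.0, 1.0, size=shape)           # background distribution
    rows = rng.choice(shape[0], tric_shape[0], replace=False)
    cols = rng.choice(shape[1], tric_shape[1], replace=False)
    ctxs = rng.choice(shape[2], tric_shape[2], replace=False)
    # Overwrite the selected subspace with the pattern plus noise
    data[np.ix_(rows, cols, ctxs)] = value + rng.normal(0.0, noise, size=tric_shape)
    return data, (np.sort(rows), np.sort(cols), np.sort(ctxs))

data, truth = plant_tricluster((50, 30, 10), (8, 5, 3), value=5.0)
```

Because the planted indices are returned alongside the data, an extrinsic metric (e.g. subspace recovery accuracy) can be computed for any triclustering algorithm's output.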


2021 ◽  
Vol 8 ◽  
Author(s):  
Tianshu Gu ◽  
Lishi Wang ◽  
Ning Xie ◽  
Xia Meng ◽  
Zhijun Li ◽  
...  

The complexity of COVID-19 and variations in control measures and containment efforts in different countries have caused difficulties in the prediction and modeling of the COVID-19 pandemic. We attempted to predict the scale of the latter half of the pandemic based on real data, using the ratio between the early and latter halves from countries where the pandemic is largely over. We collected daily pandemic data from China, South Korea, and Switzerland and calculated the ratio of pandemic data before and after the disease apex day of COVID-19. From these ratios we created multiple regression models for the relationship between the periods before and after the apex day. We then tested our models using data from the first wave of the disease from 14 countries in Europe and the US, and again using data from these countries for the entire pandemic up to March 30, 2021. Results indicate that the actual numbers of cases from these countries during the first wave mostly fall within the ranges predicted by linear regression, except for Spain and Russia. Similarly, the actual deaths in these countries mostly fall within the range of predicted data. Using the accumulated data up to the day of apex and the total accumulated data up to March 30, 2021, case numbers in these countries fall within the predicted ranges, except for Brazil. The actual number of deaths in all the countries is at or below the predicted level. In conclusion, a linear regression model built with real data from countries or regions with early pandemics can predict the pandemic scale of countries where the pandemic occurs later. Such a prediction with a high degree of accuracy provides valuable information for governments and the public.
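The regression step can be sketched with made-up numbers. The values below are synthetic placeholders, not the study's data; they only illustrate fitting the before-apex/after-apex relationship across "training" countries and applying it to a country whose apex has passed:

```python
import numpy as np

# Synthetic illustration (hypothetical numbers, not the study's data):
# cumulative cases up to the apex day vs. cases accumulated in the latter
# half, for countries whose first wave was essentially over.
cases_before_apex = np.array([8_000.0, 15_000.0, 40_000.0])
cases_after_apex = np.array([9_500.0, 16_800.0, 44_000.0])

# Linear regression between the two halves of the epidemic curve
slope, intercept = np.polyfit(cases_before_apex, cases_after_apex, 1)

def predict_latter_half(before_apex):
    """Predict the latter-half scale for a country whose apex has passed."""
    return slope * before_apex + intercept
```

A prediction interval around this line, rather than the point estimate alone, is what the abstract's "predicted ranges" refer to.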


2021 ◽  
Author(s):  
Hiroshi Ishimoto ◽  
Masahiro Hayashi ◽  
Yuzo Mano

Abstract. Using data from the Infrared Atmospheric Sounding Interferometer (IASI) measurements of volcanic ash clouds and radiative transfer calculations, we identify the optimal refractive index model for simulating the measured brightness temperature spectrum of volcanic ash material. We assume that the optimal refractive index model has the smallest root mean square of the brightness temperature difference between measurements and simulations for channels in the wavenumber range of 750–1400 cm⁻¹, and compare 21 refractive index models for the optical properties of ash particles, including recently published models. From the results of numerical simulations for 164 pixels of IASI measurements for ash clouds from 11 volcanoes, we found that the measured brightness temperature spectrum could be well simulated using certain newly established refractive index models. In the cases of the Eyjafjallajökull and Grímsvötn ash clouds, the optimal refractive index models determined through numerical simulation correspond to those deduced from the chemical composition of ash samples for the same volcanic eruption events. This finding suggests that infrared sounder measurement of volcanic ash clouds is an effective approach to estimating the optimal refractive index model. However, discrepancies between the estimated refractive index models based on satellite measurements and the associated volcanic rock types were observed for some volcanic events.
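The model-selection criterion itself is simple: among candidate models, pick the one minimising the RMS brightness-temperature difference over the stated window. A minimal sketch with synthetic spectra (the function name, model names, and curves are hypothetical):

```python
import numpy as np

def best_refractive_index_model(wavenumbers, bt_measured, bt_models,
                                lo=750.0, hi=1400.0):
    """Select the model whose simulated brightness temperatures minimise
    the RMS difference from the measurement over [lo, hi] cm^-1."""
    mask = (wavenumbers >= lo) & (wavenumbers <= hi)
    rms = {name: float(np.sqrt(np.mean((bt[mask] - bt_measured[mask]) ** 2)))
           for name, bt in bt_models.items()}
    return min(rms, key=rms.get), rms

# Synthetic check: model "B" deviates less from the measurement than "A"
wn = np.linspace(600.0, 1500.0, 181)
measured = 270.0 - 10.0 * np.exp(-((wn - 1100.0) / 100.0) ** 2)
models = {"A": measured + 2.0, "B": measured + 0.5}
best, scores = best_refractive_index_model(wn, measured, models)
```

In the paper this comparison is run per IASI pixel against radiative-transfer simulations for each of the 21 candidate models; here constant offsets stand in for the simulated spectra.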


2014 ◽  
Vol 8 (2) ◽  
Author(s):  
Ahmed El-Mowafy ◽  
Congwei Hu

Abstract This study presents validation of BeiDou measurements in un-differenced standalone mode and experimental results of its application to real data. A reparameterized form of the unknowns in a geometry-free observation model was used. Observations from each satellite are independently screened using a local modeling approach. Main advantages include that there is no need to compute inter-system biases and no satellite navigation information is needed. Validation of the triple-frequency BeiDou data was performed in static and kinematic modes: the former at two continuously operating reference stations in Australia using data spanning two consecutive days, and the latter in a walking mode for three hours. The use of the validation method parameters for numerical and graphical diagnostics of the multi-frequency BeiDou observations is discussed. The precision of the system’s observations was estimated using an empirical method that utilizes the characteristics of the validation statistics. The capability of the proposed method is demonstrated in the detection and identification of artificial errors inserted in the static BeiDou data and when implemented in single point positioning processing of the kinematic test.
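The per-satellite local-modeling screening can be sketched as a sliding polynomial fit with a robust outlier test. This is a generic illustration of the principle, not the paper's exact geometry-free parameterization; the window length, polynomial order, and threshold are hypothetical:

```python
import numpy as np

def screen_series(series, window=20, order=2, k=4.0):
    """Flag epochs whose deviation from a local polynomial fit exceeds
    k times a robust (MAD-based) scatter estimate of the fit residuals."""
    flags = np.zeros(series.size, dtype=bool)
    for i in range(series.size):
        lo = max(0, min(i - window // 2, series.size - window))
        t = np.arange(lo, lo + window)
        coef = np.polyfit(t, series[t], order)
        resid = series[t] - np.polyval(coef, t)
        sigma = 1.4826 * np.median(np.abs(resid - np.median(resid))) + 1e-12
        if abs(series[i] - np.polyval(coef, i)) > k * sigma:
            flags[i] = True
    return flags

# Synthetic geometry-free-like series: slow trend + noise + one inserted error
rng = np.random.default_rng(2)
t = np.arange(200)
gf = 0.05 * np.sin(2 * np.pi * t / 200.0) + rng.normal(0.0, 0.01, t.size)
gf[120] += 1.0                      # artificial error inserted at epoch 120
flags = screen_series(gf)
```

Because each satellite's series is screened independently, this kind of test needs no inter-system bias computation, echoing the advantage the abstract describes.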


Author(s):  
A. Stefanie Ruiz ◽  
Lili Wang ◽  
Femida Handy

This study investigates the association between the integration of first-generation immigrants and their volunteering. Using data from a Canadian national survey, we examine three dimensions of immigrant integration: professional, psychosocial and political. General volunteering is not significantly related to integration; however, there exists a relationship between the different dimensions of integration and where immigrants choose to volunteer. Thus, the relationship between the type and degree of immigrant integration and volunteering is nuanced; it matters where volunteering occurs.


Author(s):  
Carmen Friedrich ◽  
Henriette Engelhardt ◽  
Florian Schulz

Abstract Women in Middle Eastern and North African countries continue to report low levels of agency, despite their increasing educational attainment and declining fertility rates. We address this paradox by considering how women’s agency is linked to parenthood in Egypt, Jordan, and Tunisia and how this association is moderated by their level of education. We study three dimensions of instrumental agency: involvement in decision-making, financial autonomy, and freedom of movement, using data for married women aged 18–49 from the Integrated Labor Market Panel Surveys: Egypt 2012 (n = 7622), Jordan 2016 (n = 4550), and Tunisia 2014 (n = 1480). Results from multivariate regression models of these different dimensions demonstrate that married women who are mothers generally exhibit higher levels of agency than their counterparts who are childless, though this does not hold for every dimension, and the strength of the association between parenthood and agency differs by dimension and country. We find a notable exception to this pattern of positive association in the Egyptian sample: parenthood decreases agency among Egyptian women with post-secondary education. Our results suggest that parenthood may increase women’s agency only in settings with deeply entrenched patriarchal norms that imply little education for women.


2019 ◽  
Vol 49 (4) ◽  
pp. 1147-1158 ◽  
Author(s):  
Jessica M B Rees ◽  
Christopher N Foley ◽  
Stephen Burgess

Abstract Background Factorial Mendelian randomization is the use of genetic variants to answer questions about interactions. Although the approach has been used in applied investigations, little methodological advice is available on how to design or perform a factorial Mendelian randomization analysis. Previous analyses have employed a 2 × 2 approach, using dichotomized genetic scores to divide the population into four subgroups as in a factorial randomized trial. Methods We describe two distinct contexts for factorial Mendelian randomization: investigating interactions between risk factors, and investigating interactions between pharmacological interventions on risk factors. We propose two-stage least squares methods using all available genetic variants and their interactions as instrumental variables, and using continuous genetic scores as instrumental variables rather than dichotomized scores. We illustrate our methods using data from UK Biobank to investigate the interaction between body mass index and alcohol consumption on systolic blood pressure. Results Simulated and real data show that efficiency is maximized using the full set of interactions between genetic variants as instruments. In the applied example, between 4- and 10-fold improvement in efficiency is demonstrated over the 2 × 2 approach. Analyses using continuous genetic scores are more efficient than those using dichotomized scores. Efficiency is improved by finding genetic variants that divide the population at a natural break in the distribution of the risk factor, or else divide the population into more equal-sized groups. Conclusions Previous factorial Mendelian randomization analyses may have been underpowered. Efficiency can be improved by using all genetic variants and their interactions as instrumental variables, rather than the 2 × 2 approach.
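The proposed two-stage least-squares approach with continuous scores can be sketched on simulated data. This is a minimal illustration with made-up effect sizes (not UK Biobank data): the scores and their product serve as instruments for the two exposures and their interaction:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20_000
g1 = rng.normal(size=n)          # continuous genetic score for exposure 1
g2 = rng.normal(size=n)          # continuous genetic score for exposure 2
u = rng.normal(size=n)           # unmeasured confounder
x1 = 0.8 * g1 + 0.5 * u + rng.normal(size=n)
x2 = 0.8 * g2 + 0.5 * u + rng.normal(size=n)
# True effects: 0.3 and 0.2 for the main terms, 0.15 for the interaction
y = 0.3 * x1 + 0.2 * x2 + 0.15 * x1 * x2 + u + rng.normal(size=n)

# Instruments: the scores and their product; exposures incl. the interaction
Z = np.column_stack([np.ones(n), g1, g2, g1 * g2])
X = np.column_stack([np.ones(n), x1, x2, x1 * x2])

# Stage 1: project the exposures onto the instruments.
X_hat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]
# Stage 2: regress the outcome on the projected exposures.
beta = np.linalg.lstsq(X_hat, y, rcond=None)[0]
```

Unlike the 2 × 2 approach, no dichotomization of the scores is needed, and the full instrument set (here including the product term) is what drives the efficiency gain the abstract reports.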


2020 ◽  
Vol 493 (1) ◽  
pp. 1120-1129
Author(s):  
Z Yan ◽  
N Raza ◽  
L Van Waerbeke ◽  
A J Mead ◽  
I G McCarthy ◽  
...  

ABSTRACT The location of a galaxy cluster’s centroid is typically derived from observations of the galactic and/or gas component of the cluster, but these typically deviate from the true centre. This can produce bias when observations are combined to study average cluster properties. Using data from the BAryons and HAloes of MAssive Systems (BAHAMAS) cosmological hydrodynamic simulations, we study this bias in both two and three dimensions for 2000 clusters over the 10¹³–10¹⁵ M⊙ mass range. We quantify and model the offset distributions between observationally motivated centres and the ‘true’ centre of the cluster, which is taken to be the most gravitationally bound particle measured in the simulation. We fit the cumulative distribution function of offsets with exponential and Gamma distributions; the Gamma distribution fits well for most of the centroid definitions. The galaxy-based centres can be seen to be divided into a mis-centred group and a well-centred group, with the well-centred group making up about 60 per cent of all the clusters. Gas-based centres are overall less scattered than galaxy-based centres. We also find a cluster-mass dependence of the offset distribution of gas-based centres, with generally larger offsets for smaller mass clusters. We then measure cluster density profiles centred at each choice of the centres and fit them with empirical models. Stacked, mis-centred density profiles fit to the Navarro–Frenk–White dark matter profile and Komatsu–Seljak gas profile show that the recovered shape and size parameters can deviate significantly from the true values. For the galaxy-based centres, this can lead to cluster masses being underestimated by up to 10 per cent.
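Fitting an offset distribution of this kind can be sketched with a method-of-moments Gamma fit. A minimal illustration on synthetic offsets (not the BAHAMAS catalogues; the shape, scale, and units are hypothetical):

```python
import numpy as np

def fit_gamma_moments(offsets):
    """Method-of-moments fit of a Gamma distribution (shape k, scale theta)
    to a sample of centre offsets, using mean = k*theta, var = k*theta^2."""
    m, v = offsets.mean(), offsets.var()
    return m * m / v, v / m

# Synthetic centroid offsets drawn from a known Gamma distribution
rng = np.random.default_rng(0)
offsets = rng.gamma(shape=2.0, scale=50.0, size=100_000)
k, theta = fit_gamma_moments(offsets)
```

With the parametric form in hand, the fitted CDF can be compared against the empirical CDF of offsets for each centroid definition, which is the comparison the abstract describes.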


Geophysics ◽  
2010 ◽  
Vol 75 (4) ◽  
pp. L79-L90 ◽  
Author(s):  
Daniela Gerovska ◽  
Marcos J. Araúzo-Bravo ◽  
Kathryn Whaler ◽  
Petar Stavrev ◽  
Alan Reid

We present an automatic procedure for interpretation of magnetic or gravity gridded anomalies based on the finite-difference similarity transform (FDST). It is called MaGSoundFDST (magnetic and gravity sounding based on the finite-difference similarity transform) and uses a “focusing” principle, in contrast to deriving multiple clusters of many solutions as in the widely used Euler deconvolution method. The source parameters are characterized by isolated solutions, and the interpreter obtains parallel images showing the horizontal position, depth, and structural index (N) value. The underlying principle is that the FDST of a potential field anomaly becomes zero or linear at all observation points when the central point of similarity (CPS) of the transform coincides with a singular point of the source’s field and a correct N value is used. The procedure involves calculating a 3D function that evaluates the linearity of the FDST for a series of N values, using a moving window and sounding the subsurface along a vertical line under each window center. We then combine the 3D results for different N values into a single map whose minima determine the horizontal positions of the sources. The N value and the CPS depth associated with each minimum determine the structural index and depth of the corresponding source. Only one estimate characterizes a simple source, which is a major advantage over other window-based procedures. MaGSoundFDST uses only the measured anomalous field and its upward continuation, thus avoiding the direct use of field derivatives. It is independent of the magnetization-vector direction in the magnetic data case. The procedure accounts for a linear background of local gravity or magnetic anomalies and has been applied effectively to several cases of synthetic and real data.
MaGSoundFDST shares common features with the magnetic and gravity sounding based on the differential similarity transform (MaGSoundDST) but is more stable in estimating depth and structural index in the presence of random noise.

