Problems with the U.S. Part of the DNAG magnetic data set: Limitations for digital filtering

Geophysics ◽  
1993 ◽  
Vol 58 (9) ◽  
pp. 1281-1296 ◽  
Author(s):  
V. J. S. Grauch

The magnetic data set compiled for the Decade of North American Geology (DNAG) project presents an important digital database that can be used to examine the North American crust. The data represent a patchwork from many individual airborne and marine magnetic surveys. However, the portion of the data covering the conterminous U.S. has problems that limit its resolution and use. Now that the data are available in digital form, it is important to describe these limitations more specifically than before. The primary problem is caused by datum shifts across individual survey boundaries. In the western U.S., the DNAG data are generally shifted less than 100 nT. In the eastern U.S., the DNAG data may be shifted by as much as 300 nT and contain regionally shifted areas with wavelengths on the order of 800 to 1400 km. The worst case is the artificial low centered over Kentucky and Tennessee produced by a series of datum shifts. A second significant problem is a lack of anomaly resolution, arising primarily from survey data whose spacing is too wide compared with the flight heights above magnetic sources. Unfortunately, these are the only data available for much of the U.S. Another problem is produced by the lack of a common observation surface among the individual pieces of the U.S. DNAG data. The height disparities introduce variations in spatial frequency content that are unrelated to the magnetization of rocks. The spectral effects of datum shifts and the variation of spatial frequency content due to height disparities were estimated for the DNAG data for the conterminous U.S. As a general guideline for digital filtering, the most reliable features in the U.S. DNAG data have wavelengths roughly between 170 and 500 km, or anomaly half-widths between 85 and 250 km. High-quality, large-region magnetic data sets have become increasingly important to meet exploration and scientific objectives.
The acquisition of a new national magnetic data set with higher quality at a greater range of wavelengths is clearly in order. The best approach is to refly much of the U.S. with common specifications and reduction procedures. At the very least, magnetic data sets should be remerged digitally using available or newly flown long‐distance flight‐line data to adjust survey levels. In any case, national coordination is required to produce a consistent, high‐quality national magnetic map.
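The 170–500 km reliability guideline above translates directly into a wavelength-domain band-pass design. A minimal numpy sketch follows; the square grid, the grid spacing, and the hard annular cutoff are illustrative assumptions, not choices taken from the paper:

```python
import numpy as np

def bandpass_wavelength(grid, dx_km, wmin_km=170.0, wmax_km=500.0):
    """Keep only wavelengths between wmin_km and wmax_km in a gridded
    magnetic anomaly map, via a hard annular mask in the wavenumber
    domain. Illustrative sketch: Grauch's guideline supplies the band,
    not this particular filter shape.
    """
    ny, nx = grid.shape
    kx = np.fft.fftfreq(nx, d=dx_km)   # cycles per km
    ky = np.fft.fftfreq(ny, d=dx_km)
    k = np.sqrt(kx[None, :] ** 2 + ky[:, None] ** 2)
    with np.errstate(divide="ignore"):
        wavelength = np.where(k > 0, 1.0 / k, np.inf)
    mask = (wavelength >= wmin_km) & (wavelength <= wmax_km)
    return np.fft.ifft2(np.fft.fft2(grid) * mask).real
```

In practice a tapered (e.g. cosine-rolloff) transition band would be preferred over the hard cutoff to avoid ringing, but the hard mask keeps the sketch short.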


2008 ◽  
Vol 40 (1) ◽  
pp. 301-313 ◽  
Author(s):  
Jeffrey M. Gillespie ◽  
Wayne Wyatt ◽  
Brad Venuto ◽  
David Blouin ◽  
Robert Boucher

Comparisons are made of the labor requirements and profitability associated with continuous grazing at three stocking rates and with rotational grazing at a high stocking rate in the U.S. Gulf Coast region. A unique data set was collected using a time-and-motion study method to determine labor requirements. Profits are lowest for low stocking rate–continuous grazing and high stocking rate–rotational grazing. Total labor, and labor in three specific categories, is greater on a per-acre and/or per-cow basis with rotational-grazing than with continuous-grazing strategies. These results help to explain the relatively low adoption rates of rotational grazing in the region.


Geophysics ◽  
2014 ◽  
Vol 79 (1) ◽  
pp. IM1-IM9 ◽  
Author(s):  
Nathan Leon Foks ◽  
Richard Krahenbuhl ◽  
Yaoguo Li

Compressive inversion uses computational algorithms that decrease the time and storage needs of a traditional inverse problem. Most compression approaches focus on the model domain; very few, other than traditional downsampling, focus on the data domain for potential-field applications. To further the compression in the data domain, a direct and practical approach to the adaptive downsampling of potential-field data for large inversion problems has been developed. The approach is formulated to significantly reduce the quantity of data in relatively smooth or quiet regions of the data set, while preserving the signal anomalies that contain the relevant target information. Two major benefits arise from this form of compressive inversion. First, because the approach compresses the problem in the data domain, it can be applied immediately without the addition of, or modification to, existing inversion software. Second, because most industry software uses some form of model or sensitivity compression, the addition of this adaptive data sampling creates a complete compressive inversion methodology in which the reduction of computational cost is achieved simultaneously in the model and data domains. We applied the method to a synthetic magnetic data set and two large field magnetic data sets; however, the method is also applicable to other data types. Our results showed that the relevant model information is maintained after inversion despite using only 1%–5% of the data.
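The quiet-region versus anomaly split described above can be approximated with a simple gradient-based mask. The following numpy sketch is a generic stand-in for data-domain compression, not Foks et al.'s actual sampling algorithm; the percentile threshold and coarse background step are illustrative parameters:

```python
import numpy as np

def adaptive_downsample(data, base_step=4, grad_thresh=None):
    """Return a boolean mask of retained samples: keep every point where
    the local horizontal gradient is large (anomalies), but only every
    `base_step`-th point in quiet regions. A generic sketch of adaptive
    data-domain compression, not the paper's exact scheme.
    """
    gy, gx = np.gradient(data.astype(float))
    g = np.hypot(gx, gy)
    if grad_thresh is None:
        grad_thresh = np.percentile(g, 90)  # keep the busiest ~10 %
    keep = g > grad_thresh
    # coarse background sampling so quiet areas still constrain the model
    keep[::base_step, ::base_step] = True
    return keep
```

The retained subset (typically a few percent of the grid) is then passed to the inversion in place of the full data vector.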


2020 ◽  
Vol 223 (2) ◽  
pp. 1378-1397
Author(s):  
Rosemary A Renaut ◽  
Jarom D Hogue ◽  
Saeed Vatankhah ◽  
Shuang Liu

SUMMARY We discuss the focusing inversion of potential field data for the recovery of sparse subsurface structures from surface measurement data on a uniform grid. For the uniform grid, the model sensitivity matrices have a block-Toeplitz-Toeplitz-block (BTTB) structure for each block of columns related to a fixed depth layer of the subsurface. All forward operations with the sensitivity matrix, or its transpose, are then performed using the 2-D fast Fourier transform. Simulations show that the implementation of the focusing inversion algorithm using the fast Fourier transform is efficient, and that the algorithm can be realized on standard desktop computers with sufficient memory for storage of volumes up to size n ≈ 10⁶. The linear systems of equations arising in the focusing inversion algorithm are solved using either Golub–Kahan bidiagonalization or randomized singular value decomposition algorithms. These two algorithms are contrasted for their efficiency when used to solve large-scale problems, with respect to the sizes of the projected subspaces adopted for the solutions of the linear systems. The results confirm earlier studies that randomized algorithms are to be preferred for the inversion of gravity data, and that for data sets of size m it is sufficient to use projected spaces of size approximately m/8. For the inversion of magnetic data sets, we show that it is more efficient to use Golub–Kahan bidiagonalization, and that it is again sufficient to use projected spaces of size approximately m/8. The conclusions are verified for the inversion of a magnetic data set obtained over the Wuskwatim Lake region in Manitoba, Canada.
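The BTTB structure is what makes the FFT-based forward operation possible: each depth layer's sensitivity acts as a 2-D discrete convolution, so a matrix-vector product can be computed by circulant embedding without ever forming the matrix. A minimal numpy sketch of that standard embedding follows; the generator array `t` and the grid sizes are placeholders, not the paper's actual sensitivities:

```python
import numpy as np

def bttb_matvec(t, x):
    """Multiply a BTTB matrix by a grid vector using 2-D FFTs.

    t : (2m-1, 2n-1) generator array; t[i + m - 1, j + n - 1] holds the
        matrix entry coupling grid points offset by (i, j).
    x : (m, n) model grid.
    Returns the (m, n) product A @ x, computed in O(mn log mn) time.
    """
    m, n = x.shape
    # Reorder the generator into circulant (wrap-around) form: indices
    # 0..m-1 hold offsets 0..m-1, indices m..2m-2 hold -(m-1)..-1.
    c = np.roll(np.roll(t, -(m - 1), axis=0), -(n - 1), axis=1)
    xp = np.zeros((2 * m - 1, 2 * n - 1))
    xp[:m, :n] = x
    y = np.fft.ifft2(np.fft.fft2(c) * np.fft.fft2(xp)).real
    return y[:m, :n]
```

Applying this per depth layer gives the full forward operation (and, with the generator flipped, the transpose), which is the workhorse inside each Golub–Kahan or randomized-SVD iteration.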


2021 ◽  
Author(s):  
Lemgharbi Abdenaceur ◽  
Hamoudi Mohamed ◽  
Abtout Abdeslam ◽  
Abdelhamid Bendekken ◽  
Ener Aganou ◽  
...  

In order to understand the spatial and temporal behavior of the Earth's magnetic field, scientists, following C. F. Gauss's initiative in 1838, have established observatories around the world. More than 200 observatories continuously record the time variations of the magnetic field vector and maintain the best possible standards of accuracy and resolution in the measurements.

This study focuses on the acquisition and analysis of the magnetic data provided by the Algerian magnetic observatory of Tamanrasset (labelled TAM by the International Association of Geomagnetism and Aeronomy). The observatory is located in southern Algeria at 5.53°E longitude, 22.79°N latitude, at an altitude of 1373 meters above mean sea level. TAM has been running continuously since 1932, initially using old variometers such as the Mascart and La Cour instruments with photographic recording. Nowadays modern electronic equipment is used in the framework of the INTERMAGNET project. A very large geomagnetic database collected over nearly a century is available. We describe the history and the various improvements of the methods and instrumentation.

Preliminary analysis of the time series of the observatory data distinguishes two kinds of data. The first type, with low resolution, was collected between 1932 and 1992 and consists of annual, monthly, daily, and hourly means. The second, high-resolution type consists of minute and second sampling rates recorded since 1993, when TAM was integrated into the world observatory network INTERMAGNET. Part of the second data set contains many gaps, which we attempt to fill using mathematical methods. Absolute measurements and repeat-station data allow better accuracy in the secular variation and an improved regional model.

Keywords: TAM observatory, temporal variation, terrestrial magnetic field, secular variations, INTERMAGNET.
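The abstract does not say which mathematical methods are used to fill the gaps, so the following is only a generic sketch of one conservative option: linear interpolation restricted to short gaps, leaving long outages as missing so that no unobserved secular-variation structure is invented:

```python
import numpy as np

def fill_short_gaps(values, max_gap=5):
    """Linearly interpolate NaN gaps no longer than max_gap samples in a
    regularly sampled series; longer gaps are left as NaN. Illustrative
    sketch only; the TAM processing chain is not specified in the abstract.
    """
    v = np.asarray(values, dtype=float).copy()
    isnan = np.isnan(v)
    if not isnan.any():
        return v
    idx = np.arange(v.size)
    # locate contiguous NaN runs as half-open intervals [start, end)
    runs, start = [], None
    for i, bad in enumerate(isnan):
        if bad and start is None:
            start = i
        elif not bad and start is not None:
            runs.append((start, i))
            start = None
    if start is not None:
        runs.append((start, v.size))
    for s, e in runs:
        # only interior gaps short enough to bridge safely
        if e - s <= max_gap and s > 0 and e < v.size:
            v[s:e] = np.interp(idx[s:e], [s - 1, e], [v[s - 1], v[e]])
    return v
```

For geomagnetic minute data, more faithful approaches (e.g. fitting quiet-day curves or using neighboring-observatory data) would be preferred for gaps longer than a few samples.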


Author(s):  
Paul Caster ◽  
Randal J. Elder ◽  
Diane J. Janvrin

This exploratory study examines automation of the bank confirmation process using a longitudinal data set from the largest third-party U.S. confirmation service provider, supplemented with informal interviews with practitioners. We find a significant increase in electronic confirmation use in the U.S. and internationally. Errors requiring reconfirmation were fewer than two percent of all electronic confirmations. Errors made by auditors were almost five times more likely than errors made by bank employees. Most auditor errors involved use of an invalid account number, although invalid client contact, invalid request, and invalid company name errors have increased recently. Big 4 auditors made significantly more confirmation errors than did auditors at non-Big 4 national firms. Error rates and error types do not vary between confirmations initiated in the U.S. and those initiated internationally. Three themes emerged for future research: authentication of evidence, global differences in technology use, and technology adoption across firms of different sizes.


2019 ◽  
Vol 7 (2) ◽  
pp. T331-T345 ◽  
Author(s):  
Jiayong Yan ◽  
Xiangbin Chen ◽  
Guixiang Meng ◽  
Qingtian Lü ◽  
Zhen Deng ◽  
...  

Qiongheba is a polymetallic ore concentration area located on the east margin of the Junggar Basin in Xinjiang, Northwest China. Because all three main types of metal deposits in this area (porphyry-type copper, skarn-type iron-copper, and structurally altered rock-type gold deposits) are strictly controlled by fault structures and by intrusions buried under Quaternary sediments, the detection of concealed faults and intrusions is of great significance for mineral prospecting. We aim to delineate the faults and intrusions using a high-precision gravity and magnetic data set. First, multiscale edge detection of the gravity and magnetic data is used to distinguish and divide the fault systems. Second, 3D recognition of concealed intrusions, combining 3D inversion with multiscale edge detection of the gravity and magnetic data, is carried out to construct the 3D geometry of the concealed intrusions. Last, seven prospecting targets are proposed based on our research and on existing regional geologic and geochemical information; two of them have been confirmed by drilling to be rich in polymetallic mineralization (Cu-Fe-Mo-Au in the Layikeleke deposit and Cu in the Baxi deposit). Our results not only prove the effectiveness of combining 3D inversion with multiscale edge detection of gravity and magnetic data for prospecting concealed faults and intrusions, but also provide abundant information for mineral exploration prediction in the Qiongheba area.
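Multiscale edge detection of potential-field data is commonly implemented by upward-continuing the grid to a series of heights and tracking ridges of the horizontal gradient magnitude, whose maxima migrate with the depth of the source edges. The numpy sketch below shows those two building blocks under the assumption of a regular grid; it is a generic illustration, not the authors' exact workflow:

```python
import numpy as np

def upward_continue(grid, dx, height):
    """Upward-continue a gridded potential field by `height` (same units
    as dx) using the exp(-|k|h) wavenumber-domain operator."""
    ny, nx = grid.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=dx)
    k = np.sqrt(kx[None, :] ** 2 + ky[:, None] ** 2)
    return np.fft.ifft2(np.fft.fft2(grid) * np.exp(-k * height)).real

def horizontal_gradient_magnitude(grid, dx):
    """HGM = sqrt((df/dx)^2 + (df/dy)^2); its ridges mark steep lateral
    contrasts such as fault or intrusion boundaries."""
    gy, gx = np.gradient(grid, dx)
    return np.hypot(gx, gy)

# Multiscale edges: compute HGM ridges at increasing continuation
# heights, e.g. for h in (1.0, 2.0, 4.0):
#     edges_h = horizontal_gradient_magnitude(upward_continue(f, dx, h), dx)
```

Edges that persist across continuation heights are interpreted as deep-seated structures, which is what makes the method useful for mapping concealed faults beneath sedimentary cover.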


2005 ◽  
Vol 2005 (1) ◽  
pp. 143-147
Author(s):  
Daniel R. Norton

ABSTRACT The annual volume of oil spilled into the marine environment by tank vessels (tank barges and tankships) is analyzed against the total annual volume of oil transported by tank vessels in order to determine any correlational relationship. U.S. Coast Guard data were used to provide the volume of oil (petroleum) spilled into the marine environment each year by tank vessels. Data from the U.S. Army Corps of Engineers and the U.S. Department of Transportation's (US DOT) National Transportation Statistics (NTS) were used for the annual volume of oil transported via tank vessels in the United States. These data are provided in the form of tonnage and ton-miles, respectively. Each data set has inherent benefits and weaknesses. For the analysis, the volume of oil transported was used as the explanatory variable (x) and the volume of oil spilled into the marine environment as the response variable (y). Both data sets were tested for correlation. A weak relationship, r = −0.38, was found using tonnage, and no further analysis was performed. A moderately strong relationship, r = 0.79, was found using ton-miles. Further analysis using regression and a plot of residuals showed the data to be satisfactory, with no sign of lurking variables but with the year 1990 being a possible outlier.
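The workflow described (Pearson r, then ordinary least squares with a residual check) is standard and can be sketched in a few lines of numpy. The helper below is illustrative only; the Coast Guard and US DOT series themselves are not reproduced here:

```python
import numpy as np

def pearson_and_residuals(x, y):
    """Pearson correlation plus an ordinary-least-squares fit y ~ x and
    its residuals, for outlier screening. x is the explanatory variable
    (oil transported), y the response (oil spilled); generic sketch, not
    the study's code.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    r = np.corrcoef(x, y)[0, 1]
    slope, intercept = np.polyfit(x, y, 1)
    residuals = y - (slope * x + intercept)
    return r, slope, intercept, residuals
```

Plotting `residuals` against `x` (or against year) is what would flag a point like 1990 as a possible outlier: it would sit well outside the band formed by the other residuals.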

