Creating realistic models based on combined forward modeling and tomographic inversion of seismic profiling data

Geophysics ◽  
2010 ◽  
Vol 75 (3) ◽  
pp. B115-B136 ◽  
Author(s):  
Ivan Koulakov ◽  
Tatiana Stupina ◽  
Heidrun Kopp

Amplitudes and shapes of seismic patterns derived from tomographic images are often strongly biased with respect to the real structures in the earth. In particular, tomography usually provides continuous velocity distributions, whereas major velocity changes in the earth often occur across first-order interfaces. We propose an approach for constructing a realistic structure of the earth that combines forward modeling and tomographic inversion (FM&TI). Using available a priori information, we first construct a synthetic model with realistic patterns. We then compute synthetic traveltimes and invert them using the same tomographic code and the same parameters as in the processing of the observed data. We compare the reconstruction result with the tomographic image obtained from the observed-data inversion. If a discrepancy is observed, we correct the synthetic model and repeat the FM&TI process. After several trials, the synthetic and observed data yield similar inversion results; in this case, the derived synthetic model adequately represents the real structure of the earth. In a blind test of this approach, one of us created two different synthetic models with a realistic setup, and the other two performed the reconstruction with no knowledge of those models. We found that the synthetic models derived by FM&TI were closer to the true model than the tomographic inversion result. Our reconstruction results for marine data acquired in the Musicians Seamount Province in the Pacific Ocean indicate the capacity and limitations of FM&TI.
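The FM&TI scheme described above is an iterative loop. The following is a minimal Python sketch of that loop, not the authors' implementation: `forward_model`, `tomographic_invert`, `correct_model`, and the misfit tolerance are hypothetical placeholders standing in for their actual codes and acceptance criteria.

```python
import numpy as np

def fmti_loop(observed_image, synthetic_model, forward_model,
              tomographic_invert, correct_model, tol=0.05, max_trials=20):
    """Iterate the FM&TI scheme: forward-model synthetic traveltimes,
    invert them with the same tomographic code and parameters used for
    the observed data, and correct the synthetic model until the two
    reconstructed images agree."""
    for trial in range(max_trials):
        synthetic_times = forward_model(synthetic_model)       # synthetic traveltimes
        synthetic_image = tomographic_invert(synthetic_times)  # same code/parameters as observed data
        misfit = (np.linalg.norm(synthetic_image - observed_image)
                  / np.linalg.norm(observed_image))
        if misfit < tol:
            # the synthetic model now adequately represents the real structure
            return synthetic_model, trial
        synthetic_model = correct_model(synthetic_model,
                                        observed_image, synthetic_image)
    return synthetic_model, max_trials
```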

Author(s):  
M. A. Maksimov ◽  
I. V. Surodina ◽  
V. N. Glinskikh

We propose a high-performance algorithm for the inversion of data from an unmanned magnetometer complex. We provide numerical tests of tomographic inversion of data from different-height magnetic surveying and show calculation results for a test model containing a magnetic object in the form of a cylinder of finite height. We also give recommendations for applying the method to the inversion of observed data.
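The abstract does not specify the inversion algorithm itself; as a hedged illustration only, the sketch below shows the generic damped least-squares (Tikhonov) solve that underlies many linear inversions of multi-height magnetic data. `G`, `d`, and `alpha` are hypothetical placeholders, not quantities from the paper.

```python
import numpy as np

def tikhonov_inversion(G, d, alpha=1e-2):
    """Regularized least squares: m = argmin ||G m - d||^2 + alpha ||m||^2.
    G: forward operator mapping cell magnetizations to field values at
       the different survey heights (stacked row-wise)
    d: stacked observations from all survey heights
    alpha: damping factor controlling the trade-off with data fit."""
    n = G.shape[1]
    return np.linalg.solve(G.T @ G + alpha * np.eye(n), G.T @ d)
```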


Geophysics ◽  
2010 ◽  
Vol 75 (1) ◽  
pp. I1-I10 ◽  
Author(s):  
Pejman Shamsipour ◽  
Denis Marcotte ◽  
Michel Chouteau ◽  
Pierre Keating

A new application has been developed, based on the geostatistical techniques of cokriging and conditional simulation, for the 3D inversion of gravity data including geologic constraints. The necessary gravity, density, and gravity-density covariance matrices are estimated from the observed gravity data. The densities are then cokriged or simulated using the gravity data as the secondary variable. The model allows noise to be included in the observations. The method is applied to two synthetic models: a short dipping dike and a stochastic distribution of densities. Geologic information is then added as constraints to the cokriging system, and the results show the ability of the method to integrate complex a priori information. Survey data from the Matagami mining camp serve as a case study. The inversion method based on cokriging is applied to the residual anomaly to map the geology through the estimation of the density distribution in this region. The results of the inversion and simulation methods are in good agreement with the surface geology of the survey region.
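In its simplest zero-mean form, the cokriging estimate of the densities from the gravity data reduces to one linear solve with the covariance matrices mentioned above. A minimal sketch, assuming those matrices have already been estimated from the data as the abstract describes; variable names are placeholders.

```python
import numpy as np

def simple_cokriging(C_dd, C_md, d, noise_var=0.0):
    """Simple cokriging of cell densities from gravity observations.
    C_dd: gravity-gravity covariance matrix (n_obs x n_obs)
    C_md: density-gravity cross-covariance matrix (n_cells x n_obs)
    d:    observed gravity residuals (zero mean), length n_obs
    noise_var: observation-noise variance added to the diagonal,
               which is how noise in the observations enters the model."""
    n = len(d)
    # cokriging weights: (C_dd + sigma^2 I)^-1 d
    weights = np.linalg.solve(C_dd + noise_var * np.eye(n), d)
    # estimated density in each model cell
    return C_md @ weights
```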


Geophysics ◽  
1991 ◽  
Vol 56 (12) ◽  
pp. 2008-2018 ◽  
Author(s):  
Marc Lavielle

Inverse problems can be solved in different ways. One way is to define natural criteria of good recovery and build an objective function to be minimized. If, instead, we prefer a Bayesian approach, inversion can be formulated as an estimation problem in which a priori information is introduced and the a posteriori distribution of the unobserved variables is maximized. When this distribution is a Gibbs distribution, the two methods are equivalent. Furthermore, global optimization of the objective function can be performed with a Monte Carlo technique, in spite of the presence of numerous local minima. An application to multitrace deconvolution is proposed. In traditional 1-D deconvolution, a set of one-dimensional processes models the seismic data, whereas a Markov random field is used for 2-D deconvolution. The introduction of a neighborhood system permits one to model the layer structure that exists in the earth and to obtain solutions that present lateral coherency. Moreover, optimization of an appropriate objective function by simulated annealing allows one to control the fit with the input data as well as the spatial distribution of the reflectors. Extension to 3-D deconvolution is straightforward.
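The paper's Gibbs-posterior objective is not reproduced in the abstract; the sketch below shows only the generic Metropolis-style simulated-annealing skeleton that such a global optimization relies on. `objective` and `propose` are hypothetical placeholders, e.g., the posterior misfit and a move that perturbs one reflector.

```python
import numpy as np

def simulated_annealing(objective, x0, propose,
                        n_iter=10000, t0=1.0, cooling=0.999):
    """Global minimization of an objective with many local minima.
    Downhill moves are always accepted; uphill moves are accepted
    with Boltzmann probability exp(-delta/T), with T slowly cooled."""
    rng = np.random.default_rng(0)
    x, fx = x0, objective(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(n_iter):
        y = propose(x, rng)            # e.g., perturb one reflector amplitude
        fy = objective(y)
        if fy < fx or rng.random() < np.exp(-(fy - fx) / t):
            x, fx = y, fy              # accept the move
        if fx < fbest:
            best, fbest = x, fx        # track the best model seen so far
        t *= cooling                   # geometric cooling schedule
    return best, fbest
```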


Geophysics ◽  
1999 ◽  
Vol 64 (4) ◽  
pp. 1116-1125 ◽  
Author(s):  
Gualtiero Böhm ◽  
Aldo L. Vesnaver

The possible nonuniqueness and inaccuracy of tomographic inversion solutions may be the result of an inadequate discretization of the model space with respect to the acquisition geometry and the velocity field sought. Void pixels and linearly dependent equations are introduced if the grid shape does not match the spatial distribution of rays, giving rise to the well-known null space. This is a common drawback when using regular pixels. By definition, the null space does not depend on the picked traveltimes, so we cannot eliminate it by minimizing the traveltime residuals. We show that the inversion quality can be improved by following a trial-and-error approach, that is, by adapting the shape and distribution of the pixels to the layer interfaces and velocity field. The resolution can be increased or decreased locally to search for an optimal grid, although this introduces a personal bias. On the other hand, we can then decide where, why, and what a priori information is introduced into the sought velocity field, which is hardly feasible with other stabilizing tools such as damping factors and smoothing filters.
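The null-space argument can be made concrete with a small numerical check: singular vectors of the ray matrix with near-zero singular values span velocity perturbations that leave every traveltime unchanged, regardless of the picked times. A minimal sketch with a toy ray matrix containing a void pixel and linearly dependent rows:

```python
import numpy as np

def null_space_size(L, tol=1e-10):
    """Dimension of the null space of a tomographic ray matrix L
    (rows = rays, columns = pixels). Singular values below tolerance
    correspond to model perturbations invisible to all traveltimes."""
    s = np.linalg.svd(L, compute_uv=False)
    rank = int(np.sum(s > tol * s.max()))
    return L.shape[1] - rank

# Column 2 is a void pixel (crossed by no ray) and row 2 is half of
# row 1, so two null-space directions appear in a 3-pixel model:
L = np.array([[1.0, 0.0, 2.0],
              [0.5, 0.0, 1.0]])
print(null_space_size(L))  # -> 2
```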


Author(s):  
Maria A. Milkova

Nowadays, information accumulates so rapidly that the concept of the usual iterative search requires revision. In a world oversaturated with information, covering and analyzing a problem comprehensively places high demands on search methods. An innovative approach to search should flexibly take into account the large amount of already accumulated knowledge and the a priori requirements on the results. The results, in turn, should immediately provide a roadmap of the direction being studied, with the possibility of drilling down into as much detail as needed. A search approach based on topic modeling, so-called topic search, satisfies these requirements and thereby streamlines the way one works with information, increases the efficiency of knowledge production, and helps avoid cognitive biases in the perception of information, which is important at both the micro and the macro level. To demonstrate topic search in practice, the article considers the task of analyzing an import-substitution program on the basis of patent data. The program includes plans for 22 industries and contains more than 1,500 products and technologies proposed for import substitution. Patent search based on topic modeling makes it possible to search directly by blocks of a priori information, namely the terms of the industrial import-substitution plans, and to obtain as output a selection of relevant documents for each industry. This approach not only provides a comprehensive picture of the effectiveness of the program as a whole but also yields detailed information about which groups of products and technologies have been patented.
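The article's actual pipeline is not detailed in the abstract; the following is a minimal sketch of a topic search of this kind using scikit-learn's LDA. The corpus, query terms, topic count, and similarity measure are all placeholder assumptions.

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

def topic_search(documents, query_terms, n_topics=10, top_k=5):
    """Rank documents (e.g., patent abstracts) against a block of
    a priori terms (e.g., one industry's import-substitution plan)
    by cosine similarity of their topic distributions."""
    vectorizer = CountVectorizer(stop_words="english")
    X = vectorizer.fit_transform(documents)
    lda = LatentDirichletAllocation(n_components=n_topics, random_state=0)
    doc_topics = lda.fit_transform(X)          # document-topic distributions
    q = lda.transform(vectorizer.transform([" ".join(query_terms)]))[0]
    scores = (doc_topics @ q) / (np.linalg.norm(doc_topics, axis=1)
                                 * np.linalg.norm(q))
    return np.argsort(scores)[::-1][:top_k]    # indices of best matches
```

Running this per industry block yields one relevance-ranked document selection per plan, which is the per-industry output the article describes.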


1965 ◽  
Vol 20 (1) ◽  
pp. 147-151
Author(s):  
Pierre Vidal-Naquet
Keyword(s):  
A Priori ◽  

On the cover of the volume Études archéologiques, edited by Paul Courbin, one can make out the grid of a modern excavation, the one P. Courbin organized at Argos while he was secretary general of the École d'Athènes. The book thus presents itself as a manifesto in the service of a new conception of archaeology, the old one being that which "followed the walls" in order to uncover the plan of the buildings as quickly as possible, collected the material somewhat haphazardly, and, despite all the progress made since the Renaissance, persisted in "hunting" the object. Developments in these fields have been faster abroad, notably in the Anglo-Saxon countries, than in France, and it is chiefly to Sir Mortimer Wheeler, whose book Archaeology from the Earth, famous enough across the Channel to have been republished in paperback, remains the bible of the archaeologist-stratigrapher, that we owe the development of the so-called grid, or square, method, a method in fact infinitely more flexible than one might have supposed a priori. But although the Études archéologiques are above all a methodological collection, this book offers infinitely more than an initiation into modern techniques. As P. Courbin shows clearly in his preface, the work also addresses historians, who are challenged, in a sometimes slightly provocative tone, and summoned to adapt or perish. One thinks of the shock that readers of Ernest Labrousse's first works on the history of prices in the eighteenth century must have felt.

