Spiking band‐limited traces with a relative‐entropy algorithm

Geophysics ◽  
1991 ◽  
Vol 56 (7) ◽  
pp. 1003-1014 ◽  
Author(s):  
F. J. Jacobs ◽  
P. A. G. van der Geest

A novel method for the inversion of band-limited seismic traces to full-bandwidth reflectivity traces is based on a probabilistic spiky model of the reflectivity trace, in which the position indicators and amplitudes of the spikes occur as random variables, and relies on relative-entropy inference from information theory. First, an a priori model for general reflectivity traces in the prospect is derived from nearby wells. Second, the a priori distribution is updated into an a posteriori distribution for the specific trace being studied by adding the Fourier data of the seismic trace within a passband. Uncertainty about the Fourier coefficients can be accounted for by specifying a noise variance, which is implicitly infinite outside the passband. The update with relative-entropy inference is justified by its relationship with Bayesian inference. Applying maximum a posteriori (MAP) estimation to the a posteriori distribution yields the most likely spiky reflectivity trace of full bandwidth. A numerical algorithm for obtaining the MAP estimates of spike positions and spike amplitudes is derived from the concept of continuation and is described in detail. The algorithm avoids searching among all possible patterns of spike positions.
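The MAP step can be illustrated with a generic sparse-recovery stand-in. The sketch below is not the paper's relative-entropy/continuation algorithm: it assumes an L1-penalized least-squares surrogate solved by iterative soft thresholding (ISTA), with all sizes, frequencies, and parameters invented for illustration. It shows the core idea of estimating a full-bandwidth spike train from noisy passband Fourier coefficients.

```python
import numpy as np

# Hedged sketch: recover a sparse "reflectivity" trace from band-limited
# Fourier data via L1-regularized least squares (a generic stand-in for the
# paper's MAP/continuation algorithm). All parameters are illustrative.
rng = np.random.default_rng(0)
n = 128                                   # trace length
true = np.zeros(n)
true[[20, 55, 90]] = [1.0, -0.7, 0.5]     # sparse spike positions/amplitudes

# Forward operator: DFT rows restricted to a passband (band-limited data).
k = np.arange(8, 25)                      # retained frequencies (the passband)
F = np.exp(-2j * np.pi * np.outer(k, np.arange(n)) / n)
data = F @ true + 0.01 * (rng.standard_normal(k.size)
                          + 1j * rng.standard_normal(k.size))

def ista(F, d, lam=0.05, n_iter=500):
    """Soft-thresholded gradient descent for 0.5*||F x - d||^2 + lam*||x||_1."""
    x = np.zeros(F.shape[1])
    step = 1.0 / np.linalg.norm(F.conj().T @ F, 2)   # 1 / Lipschitz constant
    for _ in range(n_iter):
        g = (F.conj().T @ (F @ x - d)).real          # gradient of the misfit
        x = x - step * g
        x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0.0)  # shrink
    return x

est = ista(F, data)
# With a suitable penalty, the largest entries of `est` should concentrate
# near the true spike positions, despite the missing frequencies.
```

The soft-threshold step plays the role of the spiky prior: it drives most samples of the trace exactly to zero, so the solution is a spike train rather than a band-limited wiggle.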

2019 ◽  
Vol 294 ◽  
pp. 03012 ◽  
Author(s):  
Alexander Trofimov ◽  
Albina Kuzmenko ◽  
Halyna Nesterenko ◽  
Svitlana Avramenko ◽  
Mykhailo Muzykin ◽  
...  

The evaluation of the parameters of multi-layered foundations (railroad beds, foundations of railway structures, etc.) plays an important role in ensuring the safe movement of trains. A method is proposed for estimating the mechanical and geometric parameters of such foundations from the solutions of inverse problems for multi-layered elastic packages. The measured displacements of selected points on the package surface serve as input data for these problems. The method makes it possible to estimate the parameters of the a priori distribution of the unknown variable parameters, to identify and exclude outliers in the measured data from the model, and to construct an a posteriori estimate of the probability density of the unknown parameters with acceptable resolution. The proposed method can be used to create a new generation of equipment for non-destructive monitoring and assessment of the condition of the railroad bed and the foundations of artificial structures, together with the corresponding data-processing software for such vehicles. The use of such equipment allows the state of individual sections of the railroad to be analyzed promptly, in order to decide whether the railroad bed or the foundations of other elements of the railroad infrastructure need repair or replacement.
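The outlier-rejection ingredient can be sketched on a toy problem. The code below assumes, hypothetically, a linear forward model in which each measured surface displacement responds to two unknown layer compliances through a known sensitivity matrix; the paper's actual elastic forward model is far richer, and the thresholding rule here is a common generic choice, not the authors' procedure.

```python
import numpy as np

# Hedged sketch: toy linear inverse problem standing in for the multi-layered
# elastic package. Each displacement u_i = A[i] @ c, with c = unknown layer
# compliances. One gross outlier is injected and then detected/excluded.
rng = np.random.default_rng(1)
c_true = np.array([2.0, 0.5])              # "true" layer compliances (invented)
A = rng.uniform(0.5, 2.0, size=(30, 2))    # sensitivities of 30 measurements
u = A @ c_true + 0.01 * rng.standard_normal(30)
u[5] += 1.0                                # one gross outlier

def fit_with_outlier_rejection(A, u, thresh=3.0):
    """Least-squares fit; drop points whose residual exceeds `thresh` times
    the median absolute residual; refit on the cleaned data."""
    c, *_ = np.linalg.lstsq(A, u, rcond=None)
    r = np.abs(u - A @ c)
    keep = r <= thresh * np.median(r)
    c_clean, *_ = np.linalg.lstsq(A[keep], u[keep], rcond=None)
    return c_clean, keep

c_est, keep = fit_with_outlier_rejection(A, u)
# The injected outlier (index 5) should be flagged, and c_est should
# approach c_true once it is excluded.
```

Excluding flagged points before refitting is what keeps the a posteriori estimate of the parameters from being dragged by a single corrupted measurement.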


2011 ◽  
Vol 11 (1) ◽  
pp. 75-82 ◽  
Author(s):  
Kosnazar Sharipov

Abstract We consider the classical ill-posed problem of recovering continuous functions from noisy Fourier coefficients. For classes of functions defined in terms of generalized smoothness, we present a priori and a posteriori regularization parameter choices that realize an order-optimal error bound.
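The mechanism behind an a priori parameter choice can be shown with spectral truncation, the simplest regularizer for this problem. In the sketch below the test function, noise level, and the logarithmic cutoff rule are all my assumptions for illustration; the paper's order-optimal choices for generalized-smoothness classes are more refined.

```python
import numpy as np

# Hedged sketch: regularization by truncating noisy Fourier coefficients.
# Keeping only low frequencies trades a small truncation error for a large
# reduction in propagated noise.
rng = np.random.default_rng(2)
M = 256
t = np.linspace(0, 2 * np.pi, M, endpoint=False)
f = np.exp(np.cos(t))                     # a smooth periodic test function

coeffs = np.fft.fft(f) / M                # exact Fourier coefficients
delta = 1e-3                              # assumed noise level per coefficient
noisy = coeffs + delta * (rng.standard_normal(M)
                          + 1j * rng.standard_normal(M))

def reconstruct(noisy_coeffs, cutoff):
    """Keep only frequencies |k| <= cutoff and invert (spectral cutoff)."""
    c = np.zeros_like(noisy_coeffs)
    k = np.fft.fftfreq(M, d=1.0 / M)      # integer frequencies
    c[np.abs(k) <= cutoff] = noisy_coeffs[np.abs(k) <= cutoff]
    return np.real(np.fft.ifft(c) * M)

# A priori choice (illustrative): for an analytic function the coefficients
# decay exponentially, so cutoff ~ log(1/delta) balances the two error terms.
cutoff = int(np.log(1.0 / delta))
err_reg = np.max(np.abs(reconstruct(noisy, cutoff) - f))
err_raw = np.max(np.abs(reconstruct(noisy, M // 2) - f))  # no regularization
# err_reg should be markedly smaller than err_raw.
```

An a posteriori rule would instead pick the cutoff from the data (e.g., by a discrepancy criterion) rather than from an assumed smoothness class.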


Geophysics ◽  
1995 ◽  
Vol 60 (4) ◽  
pp. 1169-1177 ◽  
Author(s):  
Mauricio D. Sacchi ◽  
Tadeusz J. Ulrych

We present a high-resolution procedure to reconstruct common-midpoint (CMP) gathers. First, we describe the forward and inverse transformations between offset and velocity space. Then, we formulate an underdetermined linear inverse problem in which the target is the artifact-free, aperture-compensated velocity gather. We show that a sparse inversion leads to a solution that resembles the infinite-aperture velocity gather. The latter is the velocity gather that would have been estimated with a simple conjugate operator designed from an infinite-aperture seismic array. This high-resolution velocity gather is then used to reconstruct the offset space. The algorithm is formally derived using two basic principles. First, we use the principle of maximum entropy to translate prior information about the unknown parameters into a probabilistic framework, in other words, to assign a probability density function to our model. Second, we apply Bayes's rule to relate the a priori probability density function (pdf) to the pdf corresponding to the experimental uncertainties (likelihood function) and thereby construct the a posteriori distribution of the unknown parameters. Finally, the model is evaluated by maximizing the a posteriori distribution. When the problem is correctly regularized, the algorithm converges to a solution characterized by different degrees of sparseness depending on the required resolution. The solutions exhibit minimum entropy when the entropy is measured in terms of Burg's definition. We emphasize two crucial differences between our approach and the familiar Burg method of maximum entropy spectral analysis. First, Burg's entropy is minimized rather than maximized, which is equivalent to inferring as much as possible about the model from the data. Second, our approach uses the data as constraints, in contrast with the classic maximum entropy spectral analysis approach, where the autocorrelation function is the constraint.
This implies that we recover not only amplitude information but also phase information, which serves to extrapolate the data outside the original aperture of the array. The tradeoff is controlled by a single parameter that under asymptotic conditions reduces the method to a damped least‐squares solution. Finally, the high‐resolution or aperture‐compensated velocity gather is used to extrapolate near‐ and far‐offset traces.
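The contrast between a damped least-squares solution and a sparsity-promoting one can be seen on a small underdetermined system. The sketch below is not the authors' Bayesian/minimum-entropy algorithm: it uses a generic iteratively reweighted scheme (Cauchy-like weights, a common stand-in for sparse velocity-gather inversion) on invented data, purely to show how sparsity concentrates energy onto few model coefficients.

```python
import numpy as np

# Hedged sketch: underdetermined problem d = A m with a sparse true model,
# solved two ways - smooth damped least squares vs. reweighted (sparse).
rng = np.random.default_rng(3)
n_data, n_model = 20, 60
A = rng.standard_normal((n_data, n_model))
m_true = np.zeros(n_model)
m_true[[7, 23, 41]] = [1.0, -1.5, 0.8]    # a sparse "velocity gather"
d = A @ m_true

def damped_lsq(A, d, mu=0.1):
    """Minimum-norm damped least squares: m = A^T (A A^T + mu I)^-1 d."""
    return A.T @ np.linalg.solve(A @ A.T + mu * np.eye(len(d)), d)

def sparse_irls(A, d, mu=0.1, eps=1e-2, n_iter=30):
    """Iteratively reweighted least squares with Cauchy-like weights:
    small coefficients are penalized ever harder, driving them to zero."""
    m = damped_lsq(A, d, mu)
    for _ in range(n_iter):
        W = np.diag(1.0 / (m ** 2 + eps))
        m = np.linalg.solve(A.T @ A + mu * W, A.T @ d)
    return m

m_dls = damped_lsq(A, d)
m_sp = sparse_irls(A, d)
# The sparse solution should put significant energy on far fewer coefficients.
n_big_dls = int(np.sum(np.abs(m_dls) > 0.1))
n_big_sp = int(np.sum(np.abs(m_sp) > 0.1))
```

The smooth solution smears the three events across the whole model space (the finite-aperture artifact), while the reweighted one resembles the sparse, "infinite-aperture" gather the abstract describes.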


Author(s):  
David Kipping

Abstract Astronomy has always been propelled by the discovery of new phenomena lacking precedent, often followed by new theories to explain their existence and properties. In the modern era of large surveys tiling the sky at ever higher precision and sampling rates, these serendipitous discoveries look set to continue, with recent examples including Boyajian's Star, Fast Radio Bursts and 'Oumuamua. Accordingly, we here look ahead and aim to provide a statistical framework for interpreting such events and for guiding future observations, under the basic premise that the phenomenon in question stochastically repeats at some unknown, constant rate, λ. Specifically, expressions are derived for 1) the a-posteriori distribution for λ, 2) the a-posteriori distribution for the recurrence time, and, 3) the benefit-to-cost ratio of further observations relative to that of the inaugural event. Some rule-of-thumb results for each of these are found to be 1) $\lambda < \lbrace 0.7, 2.3, 4.6\rbrace \, t_1^{-1}$ to $\lbrace 50, 90, 95\rbrace {{\ \rm per\ cent}}$ confidence (where t1 = time to obtain the first detection), 2) the recurrence time is t2 < {1, 9, 99} t1 to $\lbrace 50, 90, 95\rbrace {{\ \rm per\ cent}}$ confidence, with a lack of repetition by time t2 yielding a p-value of 1/[1 + (t2/t1)], and, 3) follow-up for ≲ 10 t1 is expected to be scientifically worthwhile under an array of differing assumptions about the object's intrinsic scientific value. We apply these methods to the Breakthrough Listen Candidate 1 signal and to tidal disruption events observed by TESS.
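The quoted repetition p-value is simple enough to encode directly. The helper below just transcribes the rule of thumb from the abstract; it is not a re-derivation of the paper's posteriors, and the example times are arbitrary.

```python
# Hedged sketch: the abstract's rule of thumb that a lack of repetition by
# time t2 (after a first detection obtained in time t1) yields a p-value of
# 1 / [1 + (t2/t1)]. This merely encodes the quoted expression.

def repetition_p_value(t2, t1):
    """p-value for observing no repeat within waiting time t2, given t1."""
    return 1.0 / (1.0 + t2 / t1)

# Waiting 1, 9, or 99 times t1 without a repeat leaves
# p = 0.5, 0.1, and 0.01 respectively.
p = repetition_p_value(9.0, 1.0)   # -> 0.1
```

Read the other way, the same expression gives the posterior probability that the recurrence time exceeds t2, which is why long follow-up campaigns face diminishing returns beyond roughly 10 t1.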


2018 ◽  
Vol 12 (4) ◽  
pp. 245 ◽  
Author(s):  
Luiz Henrique Marra da Silva Ribeiro ◽  
Matheus De Souza Costa ◽  
Luiz Alberto Beijo ◽  
Alberto Frank Lázaro Aguirre ◽  
Tatiane Gomes de Araújo ◽  
...  

The Bayesian approach to regression models has shown good results in parameter estimation, as it can increase both accuracy and precision. The objective of the current study was to apply Bayesian statistics to model the yield of leaf dry matter (LM) and stem dry matter (SM), in kg ha-1, the leaf ratio (LR), and the crude protein content of leaves (CPL) and stems (CPS) (%) of Brachiaria grass as a function of N dose (0, 100, 200 and 300 kg ha-1 yr-1). Simple linear and second-degree polynomial regression models were analyzed. Information for the a priori distributions was obtained from the literature. The a posteriori distribution was generated by a Markov chain Monte Carlo method. Parameter significance was assessed with 95% HPD (highest posterior density) intervals. Model selection was performed using the DIC (deviance information criterion), and fit quality was estimated with the means and 95% HPD ranges of the Bayesian R2 distribution. The models selected for the variables LM, SM and CPS were linear, while for LR and CPL they were second-degree polynomials. The lowest doses that maximize the response variables were: LM: 274 kg ha-1 yr-1, SM: 280 kg ha-1 yr-1, LR: 113 kg ha-1 yr-1, CPL: 265 kg ha-1 yr-1, CPS: 289 kg ha-1 yr-1. The Bayesian approach allowed the inclusion of literature-verified a priori information and the identification of credible ranges around the optimal doses.
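The dose-response logic of the quadratic models can be sketched without the Bayesian machinery. The code below uses invented yield numbers (not the study's data) and an ordinary least-squares polynomial fit; the study itself fits the model by MCMC with HPD intervals and DIC-based selection. It shows how the optimizing dose falls out of the fitted coefficients.

```python
import numpy as np

# Hedged sketch: second-degree dose-response fit y = b0 + b1*x + b2*x^2 over
# the four N doses used in the study. The yields are hypothetical; the dose
# maximizing the response is the vertex -b1 / (2*b2), valid when b2 < 0.
doses = np.array([0.0, 100.0, 200.0, 300.0])          # kg ha-1 yr-1
yields = np.array([1200.0, 2100.0, 2500.0, 2400.0])   # invented LR-like data

b2, b1, b0 = np.polyfit(doses, yields, deg=2)          # highest degree first
optimal_dose = -b1 / (2.0 * b2)                        # vertex of the parabola
# For these invented data the vertex lands at 230 kg ha-1 yr-1.
```

In the Bayesian version, the same vertex expression is evaluated over the posterior draws of (b1, b2), which is what yields a credible interval for the optimal dose rather than a single point.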


Geophysics ◽  
1991 ◽  
Vol 56 (12) ◽  
pp. 2008-2018 ◽  
Author(s):  
Marc Lavielle

Inverse problems can be solved in different ways. One way is to define natural criteria of good recovery and build an objective function to be minimized. If, instead, we prefer a Bayesian approach, inversion can be formulated as an estimation problem in which a priori information is introduced and the a posteriori distribution of the unobserved variables is maximized. When this distribution is a Gibbs distribution, the two methods are equivalent. Furthermore, global optimization of the objective function can be performed with a Monte Carlo technique, in spite of the presence of numerous local minima. An application to multitrace deconvolution is proposed. In traditional 1-D deconvolution, a set of one-dimensional processes models the seismic data, while a Markov random field is used for 2-D deconvolution. The introduction of a neighborhood system permits one to model the layer structure that exists in the earth and to obtain solutions that exhibit lateral coherency. Moreover, optimization of an appropriate objective function by simulated annealing allows one to control the fit to the input data as well as the spatial distribution of the reflectors. Extension to 3-D deconvolution is straightforward.
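The global-optimization ingredient can be shown in miniature. The sketch below runs plain simulated annealing on a 1-D multimodal toy objective of my choosing; the paper's objective couples a data misfit with a Markov-random-field prior over 2-D reflector configurations, which this toy does not attempt to reproduce.

```python
import numpy as np

# Hedged sketch: simulated annealing escaping local minima of a Rastrigin-like
# 1-D function whose global minimum is at x = 0. The cooling schedule and
# proposal scale are illustrative choices.
rng = np.random.default_rng(4)

def objective(x):
    """Many local minima near the integers; global minimum at x = 0."""
    return x ** 2 + 10.0 * (1.0 - np.cos(2.0 * np.pi * x))

def simulated_annealing(f, x0, n_steps=20000, T0=10.0):
    x, fx = x0, f(x0)
    best, fbest = x, fx
    for i in range(n_steps):
        T = T0 * (1.0 - i / n_steps) + 1e-4        # linear cooling
        cand = x + rng.normal(scale=0.5)           # random local proposal
        fc = f(cand)
        # Metropolis rule: accept improvements always, worse moves sometimes.
        if fc < fx or rng.random() < np.exp(-(fc - fx) / T):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
    return best, fbest

best, fbest = simulated_annealing(objective, x0=4.3)
# Despite starting deep in a local basin, the best visited point should lie
# near the global minimum.
```

The occasional acceptance of uphill moves at high temperature is exactly what lets the annealer cross the barriers between local minima that trap a pure descent method.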


Author(s):  
Heinrich Schepers ◽  
Giorgio Tonelli ◽  
Rudolf Eisler
Keyword(s):  
A Priori ◽  
