A Trade-Off Solution of Regularized Geophysical Inversion Using Model Resolution and Covariance Matrices
2009 ◽ Author(s): Jianghai Xia ◽ Yixian Xu ◽ Richard D. Miller ◽ Chong Zeng

Geophysics ◽ 2006 ◽ Vol 71 (6) ◽ pp. R79-R90 ◽ Author(s): Michael S. Zhdanov ◽ Ekaterina Tolstaya

The existing techniques for appraisal of geophysical inverse images are based on calculating the model resolution and the model covariance matrices. In some applications, however, it becomes desirable to evaluate the upper bounds of the variations in the solution of the inverse problem. It is possible to use the Cauchy inequality for the regularized least-squares inversion to quantify the ability of an experiment to discriminate between two similar models in the presence of noise in the data. We present a new method for resolution analysis based on evaluating the spatial distribution of the upper bounds of the model variations, and we introduce a new characteristic of geophysical inversion, resolution density, defined as the inverse of these upper bounds. We derive an efficient numerical technique to compute the resolution density based on the spectral Lanczos decomposition method (SLDM). The methodology was tested on 3D synthetic linear and nonlinear electromagnetic (EM) data inversions and applied to the interpretation of helicopter-borne EM data collected by INCO Exploration in the Voisey’s Bay area of Canada.
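As a minimal illustration of the kind of appraisal the abstract refers to (not the authors' SLDM-based method), the model resolution matrix of a Tikhonov-regularized least-squares inversion can be computed directly for a small linear problem. The forward operator G and regularization parameter lam below are arbitrary placeholders:

```python
import numpy as np

rng = np.random.default_rng(0)
G = rng.normal(size=(20, 10))   # hypothetical linear forward operator (20 data, 10 model parameters)
lam = 0.1                       # Tikhonov regularization parameter (arbitrary)

# Regularized least-squares estimate: m_hat = (G^T G + lam*I)^-1 G^T d,
# so the model resolution matrix is R = (G^T G + lam*I)^-1 G^T G,
# which maps the true model onto the recovered model (m_hat = R m_true for noise-free data).
GtG = G.T @ G
R = np.linalg.solve(GtG + lam * np.eye(10), GtG)

# Each diagonal entry of R lies between 0 and 1 and indicates how well
# the corresponding model parameter is resolved (1 = perfectly resolved).
print(np.round(np.diag(R), 3))
```

As lam grows, the diagonal entries of R shrink toward zero, which is the resolution-versus-stability trade-off that regularized inversion must balance.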


2012 ◽ Vol 235 ◽ pp. 362-367 ◽ Author(s): Xun Wang ◽ Jing Wang

This paper investigates the optimal forecasting method for reducing supply chain amplification, also known as the bullwhip effect. Using a simple replenishment policy and solving Stein matrix equations, we derive the relationship between the input and output covariance matrices. Both analytical and simulation results support the view that proportional forecasting is superior for bullwhip mitigation. A trade-off exists between reducing variance amplification in orders and in inventory. For different control objectives, we propose optimal proportional controllers.
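As a rough illustration of the effect being studied (not the paper's Stein-equation analysis), the bullwhip ratio Var(orders)/Var(demand) can be simulated for an order-up-to policy driven by an exponential-smoothing forecast. The demand statistics, lead time L, smoothing constant alpha, and proportional gain beta below are all arbitrary assumptions; a gain beta < 1 plays the role of the proportional controller discussed in the abstract:

```python
import numpy as np

rng = np.random.default_rng(1)
T = 20000
demand = 100 + rng.normal(0, 10, T)   # i.i.d. demand (hypothetical numbers)

def bullwhip(alpha, L=2, beta=1.0):
    """Simplified order-up-to policy: the order-up-to level is S = (L+1)*F,
    where F is an exponential-smoothing demand forecast. A proportional
    controller passes only a fraction beta of the change in S into the order;
    beta = 1 recovers the classic (unsmoothed) order-up-to rule."""
    F = demand[0]
    S_prev = (L + 1) * F
    orders = np.empty(T)
    for t in range(T):
        F += alpha * (demand[t] - F)          # update forecast
        S = (L + 1) * F                       # order-up-to level
        orders[t] = demand[t] + beta * (S - S_prev)
        S_prev = S
    return orders.var() / demand.var()        # bullwhip ratio

# beta < 1 damps the order variance amplification relative to beta = 1
print(bullwhip(alpha=0.3, beta=1.0), bullwhip(alpha=0.3, beta=0.4))
```

In this toy model the smoothed policy buys lower order variance at the cost of slower inventory recovery, mirroring the order-versus-inventory trade-off the abstract describes.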


Biology ◽ 2021 ◽ Vol 10 (8) ◽ pp. 702 ◽ Author(s): Guillermo B. Morales ◽ Miguel A. Muñoz

Shedding light on how biological systems represent, process and store information in noisy environments is a key and challenging goal. A stimulating, though controversial, hypothesis posits that operating in dynamical regimes near the edge of a phase transition, i.e., at criticality or the “edge of chaos”, can provide information-processing living systems with important operational advantages, creating, e.g., an optimal trade-off between robustness and flexibility. Here, we elaborate on a recent theoretical result, which establishes that the spectrum of covariance matrices of neural networks representing complex inputs in a robust way needs to decay as a power law of the rank, with an exponent close to unity; this result has indeed been verified experimentally in neurons of the mouse visual cortex. Aiming to understand and mimic these results, we construct an artificial neural network and train it to classify images. We find that the best performance in this task is obtained when the network operates near the critical point, at which the eigenspectrum of the covariance matrix follows the very same statistics as actual neurons do. Thus, we conclude that operating near criticality can also have, besides the usually alleged virtues, the advantage of allowing for flexible, robust and efficient input representations.
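The power-law decay of the covariance eigenspectrum described above can be reproduced on synthetic data. The sketch below is an illustrative construction, not the authors' network: it builds activity whose covariance eigenvalues decay as rank^(-mu), with mu = 1 chosen near the exponent discussed in the text, and then recovers the exponent with a log-log fit on the empirical spectrum:

```python
import numpy as np

rng = np.random.default_rng(2)
n, T = 100, 5000   # number of "neurons" and time samples (arbitrary)

# Target spectrum: eigenvalue of rank k decays as k^(-mu).
mu = 1.0
eigvals = np.arange(1, n + 1) ** (-mu)

# Build activity X with covariance Q diag(eigvals) Q^T for a random
# orthonormal basis Q, then sample T observations.
A = rng.normal(size=(n, n))
Q, _ = np.linalg.qr(A)                      # random orthonormal basis
X = (Q * np.sqrt(eigvals)) @ rng.normal(size=(n, T))

# Estimate the decay exponent from the empirical covariance spectrum.
C = np.cov(X)
lam = np.sort(np.linalg.eigvalsh(C))[::-1]  # eigenvalues, descending
ranks = np.arange(1, n + 1)
slope, _ = np.polyfit(np.log(ranks[:50]), np.log(lam[:50]), 1)
print(round(-slope, 2))   # fitted exponent, close to mu
```

The same log-log fit applied to recorded neural activity is, in essence, how the near-unity exponent is measured in the experimental data the abstract cites.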


2010 ◽ Vol 167 (12) ◽ pp. 1537-1547 ◽ Author(s): Jianghai Xia ◽ Yixian Xu ◽ Richard D. Miller ◽ Chong Zeng

1982 ◽ Vol 14 (2) ◽ pp. 109-113 ◽ Author(s): Suleyman Tufekci

2012 ◽ Vol 11 (3) ◽ pp. 118-126 ◽ Author(s): Olive Emil Wetter ◽ Jürgen Wegge ◽ Klaus Jonas ◽ Klaus-Helmut Schmidt

In most work contexts, several performance goals coexist, and conflicts and trade-offs between them can occur. Our paper is the first to contrast a dual goal for speed and accuracy with a single goal for speed on the same task. The Sternberg paradigm (Experiment 1, n = 57) and the d2 test (Experiment 2, n = 19) were used as performance tasks. In both experiments, speed measures and error rates revealed that dual as well as single goals increase performance by enhancing memory scanning. However, the single speed goal triggered a speed-accuracy trade-off, favoring speed over accuracy, whereas the dual goal did not. In difficult trials, dual goals slowed scanning processes down again so that errors could be prevented. This new finding is particularly relevant for security domains, where both aspects have to be managed simultaneously.

