On the reconstruction of the shape of a seismic streamer

Geophysics ◽  
2010 ◽  
Vol 75 (1) ◽  
pp. H1-H6
Author(s):  
Bruno Goutorbe ◽  
Violaine Combier

In the context of 3D seismic acquisition, reconstructing the shape of the streamer(s) for each shot is an essential step prior to data processing. Depending on the survey, several kinds of constraints help achieve this purpose: local azimuths given by compasses, absolute positions recorded by global positioning system (GPS) devices, and distances calculated between pairs of acoustic ranging devices. Most reconstruction methods are restricted to a single type of constraint and do not estimate the final uncertainties. The generalized inversion formalism using the least-squares criterion provides a robust framework to solve such a problem: it handles several kinds of constraints together, does not require an a priori parameterization of the streamer shape, extends naturally to any configuration of streamer(s), and gives rigorous uncertainties. We explicitly derive the equations governing the algorithm corresponding to a marine seismic survey using a single streamer with compasses distributed all along it and GPS devices located on the tail buoy and on the vessel. Reconstruction tests conducted on several synthetic examples show that the algorithm performs well, with a mean error of a few meters in realistic cases. The accuracy logically degrades if higher random errors are added to the synthetic data or if deformations of the streamer occur at a short length scale.
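The kind of inversion described in the abstract can be illustrated with a toy linear least-squares reconstruction. The sketch below is not the authors' algorithm: it assumes a single streamer whose segment headings (from compasses) become linear constraints on consecutive node positions, with GPS fixes pinning head and tail. All geometry, spacing, and noise levels are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical single streamer: n nodes spaced d metres apart.
n, d = 50, 12.5

# Synthetic "true" shape: gentle feathering along the cable.
theta_true = np.deg2rad(90 + 5 * np.sin(np.linspace(0, np.pi, n - 1)))
steps = d * np.column_stack([np.cos(theta_true), np.sin(theta_true)])
true_pos = np.vstack([[0.0, 0.0], np.cumsum(steps, axis=0)])

# Observations: noisy compass heading per segment, GPS on head and tail.
theta_obs = theta_true + np.deg2rad(0.5) * rng.standard_normal(n - 1)
gps_head = true_pos[0] + 2.0 * rng.standard_normal(2)
gps_tail = true_pos[-1] + 2.0 * rng.standard_normal(2)

# Linear least squares over the 2n node coordinates: each compass constrains
# p[i+1] - p[i] ~= d * (cos(theta), sin(theta)); the GPS fixes pin the ends.
A = np.zeros((2 * (n - 1) + 4, 2 * n))
b = np.zeros(A.shape[0])
for i in range(n - 1):
    for k in range(2):
        r = 2 * i + k
        A[r, 2 * i + k] = -1.0
        A[r, 2 * (i + 1) + k] = 1.0
        b[r] = d * (np.cos(theta_obs[i]) if k == 0 else np.sin(theta_obs[i]))
A[-4, 0] = A[-3, 1] = A[-2, 2 * n - 2] = A[-1, 2 * n - 1] = 1.0
b[-4:] = np.concatenate([gps_head, gps_tail])

sol, *_ = np.linalg.lstsq(A, b, rcond=None)
est = sol.reshape(n, 2)
mean_err = np.mean(np.linalg.norm(est - true_pos, axis=1))
```

With metre-level GPS noise and sub-degree compass noise, the mean position error stays at a few metres, consistent with the abstract's synthetic tests; the paper's formalism additionally propagates these noise statistics into rigorous a posteriori uncertainties.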

2009 ◽  
Vol 49 (2) ◽  
pp. 567
Author(s):  
Cameron Grebe ◽  
Luke Smith ◽  
Craig Reid

In September 2007 Woodside Energy Ltd, as operator of the Browse LNG Development, conducted the Maxima three-dimensional marine seismic survey (Maxima) at Scott Reef, a shelf-emergent coral atoll located in the Indian Ocean approximately 425 km north of Broome, Western Australia. Implementation of the survey followed extensive state and Commonwealth environmental approvals processes that began more than 12 months earlier. The survey drew regulator and stakeholder attention, with focus on the uncertainties associated with predicting impacts on the Scott Reef marine environment as a result of exposure to airgun noise emissions. Ministerial conditions for approval of the survey concluded that significant impacts were unlikely (Woodside 2008), but required Woodside to address the inherent uncertainties through implementation of a suite of research and monitoring activities prior to, during, and after Maxima. Most were completed on site at Scott Reef as part of a field verification study conducted in advance of the full data acquisition phase of Maxima. These monitoring studies showed that the actual effects of exposure to an airgun array were lower than predicted and established new sound exposure thresholds.


2014 ◽  
Vol 7 (3) ◽  
pp. 781-797 ◽  
Author(s):  
P. Paatero ◽  
S. Eberly ◽  
S. G. Brown ◽  
G. A. Norris

Abstract. The EPA PMF (Environmental Protection Agency positive matrix factorization) version 5.0 and the underlying multilinear engine-executable ME-2 contain three methods for estimating uncertainty in factor analytic models: classical bootstrap (BS), displacement of factor elements (DISP), and bootstrap enhanced by displacement of factor elements (BS-DISP). The goal of these methods is to capture the uncertainty of PMF analyses due to random errors and rotational ambiguity. It is shown that the three methods complement each other: depending on characteristics of the data set, one method may provide better results than the other two. Results are presented using synthetic data sets, including interpretation of diagnostics, and recommendations are given for parameters to report when documenting uncertainty estimates from EPA PMF or ME-2 applications.
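The classical bootstrap (BS) idea can be sketched outside EPA PMF itself. The snippet below is an illustration, not the PMF/ME-2 implementation: it substitutes a minimal unweighted NMF for the PMF model, resamples sample rows, refits, matches each base factor to its closest bootstrap factor, and reads off percentile intervals on the factor profiles. All data and factor profiles are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

def nmf(X, k, iters=300):
    """Minimal multiplicative-update NMF, a stand-in for the PMF model."""
    W = rng.random((X.shape[0], k)) + 1e-3
    H = rng.random((k, X.shape[1])) + 1e-3
    for _ in range(iters):
        H *= (W.T @ X) / (W.T @ W @ H + 1e-12)
        W *= (X @ H.T) / (W @ H @ H.T + 1e-12)
    return W, H

# Synthetic two-factor data set (profiles invented for illustration).
H_true = np.array([[5., 1., 0., 2.],
                   [0., 2., 4., 1.]])
W_true = rng.random((200, 2))
X = W_true @ H_true + 0.05 * rng.random((200, 4))

_, H_base = nmf(X, 2)

# Classical bootstrap (BS): resample rows, refit, match each base factor to
# its best-correlated bootstrap factor, and accumulate the matched profiles.
profiles = []
for _ in range(50):
    Xb = X[rng.integers(0, len(X), len(X))]
    _, Hb = nmf(Xb, 2)
    unit = lambda M: M / np.linalg.norm(M, axis=1, keepdims=True)
    match = np.argmax(unit(H_base) @ unit(Hb).T, axis=1)
    profiles.append(Hb[match])

lo, hi = np.percentile(profiles, [5, 95], axis=0)  # elementwise 90% intervals
```

BS captures random-error effects in this way; DISP and BS-DISP additionally probe rotational ambiguity by displacing individual factor elements, which this sketch does not attempt.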


2021 ◽  
Author(s):  
Yosuke Teranishi ◽  
Fumitoshi Murakami ◽  
Shinji Kawasaki ◽  
Motonori Higashinaka ◽  
Kei Konno ◽  
...  

2017 ◽  
Vol 17 (11) ◽  
pp. 6663-6678 ◽  
Author(s):  
Shreeya Verma ◽  
Julia Marshall ◽  
Mark Parrington ◽  
Anna Agustí-Panareda ◽  
Sebastien Massart ◽  
...  

Abstract. Airborne observations of greenhouse gases are a very useful reference for validation of satellite-based column-averaged dry air mole fraction data. However, since the aircraft data are available only up to about 9–13 km altitude, these profiles do not fully represent the depth of the atmosphere observed by satellites and therefore need to be extended synthetically into the stratosphere. In the near future, observations of CO2 and CH4 made from passenger aircraft are expected to be available through the In-Service Aircraft for a Global Observing System (IAGOS) project. In this study, we analyse three different data sources that are available for the stratospheric extension of aircraft profiles by comparing the error introduced by each of them into the total column and provide recommendations regarding the best approach. First, we analyse CH4 fields from two different models of atmospheric composition: the European Centre for Medium-Range Weather Forecasts (ECMWF) Integrated Forecasting System for Composition (C-IFS) and the TOMCAT/SLIMCAT 3-D chemical transport model. Secondly, we consider scenarios that simulate the effect of using CH4 climatologies such as those based on balloons or satellite limb soundings. Thirdly, we assess the impact of using the a priori profiles from the satellite retrievals for the stratospheric part of the total column. We find that the models considered in this study estimate stratospheric CH4 better than the climatology-based data and the satellite a priori profiles. Both the C-IFS and TOMCAT models have a bias of about −9 ppb at the locations where tropospheric vertical profiles will be measured by IAGOS. The C-IFS model, however, has a lower random error (6.5 ppb) than TOMCAT (12.8 ppb). These values are well within the minimum desired accuracy and precision of satellite total column XCH4 retrievals (10 and 34 ppb, respectively).
In comparison, the a priori profile from the University of Leicester Greenhouse Gases Observing Satellite (GOSAT) Proxy XCH4 retrieval and the climatology-based data introduce larger random errors in the total column, as they are limited in spatial coverage and temporal variability. Furthermore, we find that the bias in the models varies with latitude and season. Therefore, applying an appropriate bias correction to the model fields before using them for profile extension is expected to further decrease the error contributed by the stratospheric part of the profile to the total column.
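How a stratospheric extension feeds into the total column can be sketched with a pressure-weighted average. All levels and mole fractions below are invented for illustration, and a real XCH4 retrieval also applies the instrument's averaging kernel; the last lines show why a −9 ppb model bias above the aircraft ceiling perturbs the column by only a fraction of that amount.

```python
import numpy as np

# Hypothetical pressure levels (hPa) and CH4 (ppb): aircraft data cover the
# levels down to 200 hPa; model values supply the stratospheric extension.
p_levels = np.array([1000., 850., 700., 500., 300., 200., 100., 50., 10.])
ch4_aircraft = np.array([1900., 1890., 1880., 1870., 1850., 1840.])
ch4_model_strat = np.array([1700., 1500., 1200.])
profile = np.concatenate([ch4_aircraft, ch4_model_strat])

# Simple layer-thickness pressure weights from each level down to the top of
# the atmosphere (a real retrieval would use its pressure-weighting function).
dp = -np.diff(np.append(p_levels, 0.0))
xch4 = np.sum(profile * dp) / np.sum(dp)

# The stratospheric layers carry only a fraction of the total pressure weight,
# so a stratospheric model bias is strongly diluted in the column average.
strat_weight = dp[6:].sum() / dp.sum()     # here 0.1 of the column
bias_effect = -9.0 * strat_weight          # a -9 ppb bias shifts XCH4 by -0.9 ppb
```

This dilution is why a stratospheric bias of −9 ppb remains well within the 10 ppb accuracy target quoted in the abstract once it is mapped into the total column.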


2011 ◽  
Vol 2011 ◽  
pp. 1-13 ◽  
Author(s):  
Michael Ghijsen ◽  
Yuting Lin ◽  
Mitchell Hsing ◽  
Orhan Nalcioglu ◽  
Gultekin Gulsen

Diffuse Optical Tomography (DOT) is an optical imaging modality that has various clinical applications. However, the spatial resolution and quantitative accuracy of DOT are poor due to strong photon scattering in biological tissue. Structural a priori information from another high-spatial-resolution imaging modality such as Magnetic Resonance Imaging (MRI) has been demonstrated to significantly improve DOT accuracy. In addition, a contrast agent can be used to obtain differential absorption images of the lesion by using dynamic contrast-enhanced DOT (DCE-DOT). This produces a relative absorption map obtained by subtracting a reconstructed baseline image from reconstructed images in which optical contrast is included. In this study, we investigated and compared different reconstruction methods and analysis approaches for regular endogenous DOT and DCE-DOT with and without MR anatomical a priori information for arbitrarily shaped objects. Our phantom and animal studies have shown that superior image quality and higher accuracy can be achieved using DCE-DOT together with MR structural a priori information. Hence, implementation of a combined MRI-DOT system to image ICG enhancement can potentially be a promising tool for breast cancer imaging.
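The role of a structural a priori can be sketched with a linearized toy reconstruction. This is an illustration under invented dimensions, not the study's reconstruction code: an MR-derived label map defines a "soft prior" regularizer that penalizes absorption differences only within a tissue region, never across region boundaries.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy linearized DOT problem: a sensitivity matrix A maps absorption changes
# in 30 voxels to 20 boundary measurements (all dimensions are invented).
A = rng.random((20, 30))
x_true = np.zeros(30)
x_true[10:15] = 0.1                      # hypothetical lesion in voxels 10-14
b = A @ x_true + 1e-3 * rng.standard_normal(20)

# MR-derived "soft prior": voxels sharing a tissue label are encouraged to
# share absorption, so the regularizer couples only within-region neighbours.
labels = np.zeros(30, dtype=int)
labels[10:15] = 1
rows = []
for i in range(29):
    if labels[i] == labels[i + 1]:
        row = np.zeros(30)
        row[i], row[i + 1] = 1.0, -1.0
        rows.append(row)
L = np.array(rows)

# Tikhonov-style solve: minimize ||Ax - b||^2 + lam * ||Lx||^2.
lam = 1e-2
x_rec = np.linalg.solve(A.T @ A + lam * L.T @ L, A.T @ b)
```

For a DCE-DOT-style differential image, the same solve would be repeated on baseline and contrast-enhanced measurements and the two reconstructions subtracted, yielding the relative absorption map the abstract describes.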


2020 ◽  
Vol 14 (4) ◽  
pp. 640-652
Author(s):  
Abraham Gale ◽  
Amélie Marian

Ranking functions are commonly used to assist in decision-making in a wide variety of applications. As the general public realizes the significant societal impacts of the widespread use of algorithms in decision-making, there has been a push towards explainability and transparency in decision processes and results, as well as demands to justify the fairness of the processes. In this paper, we focus on providing metrics for the explainability and transparency of ranking functions, with a focus on making the ranking process understandable a priori, so that decision-makers can make informed choices when designing their ranking selection process. We propose transparent participation metrics to clarify the ranking process by assessing the contribution of each parameter used in the ranking function to the final ranked outcome, using information about the ranking functions themselves as well as observations of the underlying distributions of the parameter values involved in the ranking. To evaluate the outcome of the ranking process, we propose diversity and disparity metrics to measure how similar the selected objects are to each other and to the underlying data distribution. We evaluate the behavior of our metrics on synthetic data, as well as on data and ranking functions from two real-world scenarios: high school admissions and decathlon scoring.
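For a weighted-sum ranking function, participation- and disparity-style quantities can be computed directly from the ranking inputs. The definitions below are our own illustrative stand-ins on synthetic data, not the paper's exact metrics.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical weighted-sum ranking over three criteria scaled to [0, 1].
weights = np.array([0.5, 0.3, 0.2])
X = rng.random((1000, 3))                  # candidate pool
scores = X @ weights
top = np.argsort(scores)[::-1][:100]       # select the top 100 candidates

# Illustrative participation metric: each criterion's share of the selected
# candidates' total score (not the paper's exact definition).
contrib = (X[top] * weights).sum(axis=0)
participation = contrib / contrib.sum()

# Illustrative disparity metric: how the selected set's criterion means
# deviate from the underlying pool distribution.
disparity = X[top].mean(axis=0) - X.mean(axis=0)
```

Even this simple version makes the a priori trade-off visible: the heavily weighted criterion dominates participation, and the selected set drifts furthest from the pool along that same criterion.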


1976 ◽  
Vol 66 (1) ◽  
pp. 173-187
Author(s):  
Ray Buland

Abstract. A complete reexamination of Geiger's method in the light of modern numerical analysis indicates that numerical stability can be ensured by use of the QR algorithm and the convergence domain considerably enlarged by the introduction of step-length damping. In order to make the maximum use of all data, the method is developed assuming a priori estimates of the statistics of the random errors at each station. Numerical experiments indicate that the bulk of the joint probability density of the location parameters is in the linear region, allowing simple estimates of the standard errors of the parameters. The location parameters are found to be distributed as one minus chi squared with m degrees of freedom, where m is the number of parameters, allowing the simple construction of confidence levels. The use of the chi-squared test with n − m degrees of freedom, where n is the number of data, is introduced as a means of qualitatively evaluating the correctness of the earth model.
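Geiger-style iterative location with a QR-based linear step and step-length damping can be sketched on a 2-D toy problem. Everything below is invented for illustration (constant velocity, fixed damping factor rather than an adaptive one, arbitrary geometry and noise); it shows the structure of the iteration, not the paper's full algorithm.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy setup: locate an epicentre (x, y) and origin time t0 from arrival
# times at 8 stations in a constant-velocity medium (km, s, km/s).
v = 6.0
stations = rng.uniform(-100, 100, (8, 2))
true = np.array([12.0, -7.0, 1.5])            # x, y, t0
dist = np.linalg.norm(stations - true[:2], axis=1)
t_obs = true[2] + dist / v + 0.01 * rng.standard_normal(8)

m = np.array([0.0, 0.0, 0.0])                 # starting model
for _ in range(20):
    d = np.linalg.norm(stations - m[:2], axis=1)
    r = t_obs - (m[2] + d / v)                # travel-time residuals
    # Jacobian of predicted arrival time w.r.t. (x, y, t0)
    G = np.column_stack([-(stations[:, 0] - m[0]) / (v * d),
                         -(stations[:, 1] - m[1]) / (v * d),
                         np.ones(len(d))])
    Q, R = np.linalg.qr(G)                    # QR step for numerical stability
    step = np.linalg.solve(R, Q.T @ r)
    m += 0.5 * step                           # fixed step-length damping

# Linearized a posteriori covariance from the converged Jacobian.
d = np.linalg.norm(stations - m[:2], axis=1)
r = t_obs - (m[2] + d / v)
sigma2 = (r @ r) / (len(r) - 3)               # a posteriori variance, n - m dof
cov = sigma2 * np.linalg.inv(R.T @ R)         # standard errors of (x, y, t0)
```

The QR factorization avoids forming the normal equations explicitly, and the damping factor keeps early, strongly nonlinear steps inside the convergence domain, which is the essence of the paper's two numerical remedies.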


First Break ◽  
2005 ◽  
Vol 23 (6) ◽  
Author(s):  
M. Widmaier ◽  
A. Long ◽  
B. Danielsen ◽  
S. Hegna
