A proposed polarity standard for multicomponent seismic data

Geophysics ◽  
2002 ◽  
Vol 67 (4) ◽  
pp. 1028-1037 ◽  
Author(s):  
R. James Brown ◽  
Robert R. Stewart ◽  
Don C. Lawton

This paper proposes a multicomponent acquisition and preprocessing polarity standard that will apply generally to the three Cartesian geophone components and the hydrophone or microphone components of a 2‐D or 3‐D multicomponent survey on land, at the sea bottom, acquired as a vertical seismic profile, vertical‐cable, or marine streamer survey. We use a four‐component ocean‐bottom data set for purposes of illustration and example. A primary objective is a consistent system of polarity specifications to facilitate consistent horizon correlation among multicomponent data sets and enable determination of correct reflectivity polarity. The basis of this standard is the current SEG polarity standard, first enunciated as a field‐recording standard for vertical geophone data and hydrophone streamer data. It is founded on a right‐handed coordinate system: z positive downward; x positive in the forward line direction in a 2‐D survey, or a specified direction in a 3‐D survey, usually that of the receiver‐cable lines; and y positive in the direction 90° clockwise from x. The polarities of these axes determine the polarity of ground motion in any component direction (e.g., downward ground motion recording as positive values on the vertical‐geophone trace). According also to this SEG standard, a pressure decrease is to be recorded as positive output on the hydrophone trace. We also recommend a cyclic indexing convention, [W, X, Y, Z] or [0, 1, 2, 3], to denote hydrophone or microphone (pressure), inline (radial) geophone, crossline (transverse) geophone, and vertical geophone, respectively. We distinguish among three kinds of polarity standard: acquisition, preprocessing, and final‐display standards. The acquisition standard (summarized in the preceding paragraph) relates instrument output solely to sense of ground motion (geophones) and of pressure change (hydrophones). 
Polarity considerations beyond this [involving, e.g., source type, wave type (P or S), direction of arrival, anisotropy, tap‐test adjustments, etc.] fall under preprocessing polarity standards. We largely defer any consideration of a display standard.
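The axis and component conventions above can be illustrated with a short Python sketch; this is only an illustration of the stated convention, and the variable names are ours, not part of the standard:

```python
import numpy as np

# SEG-style right-handed acquisition frame described above:
# z positive downward, x positive in the forward line direction,
# y positive 90 degrees clockwise from x (viewed from above, z down).
x = np.array([1.0, 0.0, 0.0])  # inline (radial)
y = np.array([0.0, 1.0, 0.0])  # crossline (transverse)
z = np.array([0.0, 0.0, 1.0])  # vertical, positive down

# In a right-handed system the cross product of x and y gives z.
assert np.allclose(np.cross(x, y), z)

# The proposed cyclic indexing of the four components:
# W/0 = hydrophone or microphone (pressure), X/1 = inline geophone,
# Y/2 = crossline geophone, Z/3 = vertical geophone.
COMPONENTS = {0: "W (pressure)", 1: "X (inline)",
              2: "Y (crossline)", 3: "Z (vertical)"}
```

The cross-product check makes the handedness of the frame explicit: with z down and y 90° clockwise from x, the triplet (x, y, z) is right-handed.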

2019 ◽  
Vol 219 (1) ◽  
pp. 66-79 ◽  
Author(s):  
H Simon ◽  
S Buske ◽  
P Hedin ◽  
C Juhlin ◽  
F Krauß ◽  
...  

SUMMARY: A remarkably well-preserved representation of a deeply eroded Palaeozoic orogen is found in the Scandinavian Caledonides, formed by the collision of the two palaeocontinents Baltica and Laurentia. Today, after 400 Ma of erosion along with uplift and extension during the opening of the North Atlantic Ocean, the geological structures in central western Sweden comprise far-transported allochthonous units, the underlying Precambrian crystalline basement, and a shallow west-dipping décollement that separates the two and is associated with a thin layer of Cambrian black shales. These structures, in particular the Seve Nappes (upper part of the Middle Allochthons), the Lower Allochthons and the highly reflective basement, are the target of the two approximately 2.5 km deep, fully cored scientific boreholes in central Sweden that are part of the project COSC (Collisional Orogeny in the Scandinavian Caledonides). Thus, a continuous 5 km tectonostratigraphic profile through the Caledonian nappes into Baltica’s basement will be recovered. The first borehole, COSC-1, was successfully drilled in 2014 and revealed a thick section of the seismically highly reflective Lower Seve Nappe. The Seve Nappe Complex, mainly consisting of felsic gneisses and mafic amphibolites, appears to be highly anisotropic. To allow for extrapolation of findings from core analysis and downhole logging to the structures around the borehole, several surface and borehole seismic experiments were conducted. Here, we use three long-offset surface seismic profiles centred on the borehole COSC-1 to image the structures in the vicinity of the borehole and below it. We applied Kirchhoff pre-stack depth migration, taking into account the seismic anisotropy in the Seve Nappe Complex. 
We calculated Green’s functions using an anisotropic eikonal solver for a VTI (transversely isotropic with a vertical axis of symmetry) velocity model, previously derived from the analysis of VSP (vertical seismic profile) and surface seismic data. We show that the anisotropic results are superior to the corresponding isotropic depth migration: the reflections appear significantly more continuous and better focused. The depth imaging of the long-offset profiles provides a link between a high-resolution 3-D data set and the regional-scale 2-D COSC Seismic Profile and complements these data sets, especially in the deeper parts below the borehole. Moreover, many of the reflective structures can be observed consistently across the different data sets. Most of the dominant reflections imaged originate below the bottom of the borehole and are situated within the Precambrian basement or at the transition zones between the Middle and Lower Allochthons and the basement. The origin of the deeper reflections remains enigmatic; they possibly represent dolerite intrusions or deformation zones of Caledonian or pre-Caledonian age.
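As a rough sketch of how VTI anisotropy enters such a velocity model, the weak-anisotropy P-wave phase-velocity approximation in terms of Thomsen parameters can be written in a few lines of Python; the parameter values below are placeholders, not the COSC-1 model:

```python
import numpy as np

def vti_p_phase_velocity(theta, vp0, epsilon, delta):
    """Weak-anisotropy P-wave phase velocity in a VTI medium (Thomsen
    approximation); theta is the angle from the vertical symmetry axis,
    in radians."""
    s2 = np.sin(theta) ** 2
    c2 = np.cos(theta) ** 2
    return vp0 * (1.0 + delta * s2 * c2 + epsilon * s2 ** 2)

# Placeholder values, NOT the COSC-1 model parameters.
vp0, eps, dlt = 6000.0, 0.1, 0.05
v_vert = vti_p_phase_velocity(0.0, vp0, eps, dlt)        # along the axis
v_horz = vti_p_phase_velocity(np.pi / 2, vp0, eps, dlt)  # horizontal

assert np.isclose(v_vert, vp0)              # no correction along the axis
assert np.isclose(v_horz, vp0 * (1 + eps))  # horizontal speed set by epsilon
```

An anisotropic eikonal solver propagates traveltimes using a direction-dependent velocity of this kind instead of a single scalar velocity per cell.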


Geophysics ◽  
2016 ◽  
Vol 81 (3) ◽  
pp. V213-V225 ◽  
Author(s):  
Shaohuan Zu ◽  
Hui Zhou ◽  
Yangkang Chen ◽  
Shan Qu ◽  
Xiaofeng Zou ◽  
...  

We have designed a periodically varying code that avoids the problem of local coherency and makes the interference distribute uniformly in a given range; hence, it is better at suppressing incoherent interference (blending noise) and preserving coherent useful signals than a random dithering code. We have also devised a new form of the iterative method to remove interference generated by simultaneous-source acquisition. In each iteration, we estimate the interference using the blending operator following the proposed formula and then subtract it from the pseudodeblended data. To further eliminate the incoherent interference and constrain the inversion, the data are then transformed to an auxiliary sparse domain, where a thresholding operator is applied. During the iterations, the threshold decreases from its largest value to zero following an exponential function. The exponentially decreasing threshold aims to gradually pass the deblended data to a more acceptable model subspace. Two numerically blended synthetic data sets and one numerically blended field data set from an ocean-bottom cable demonstrate the usefulness of the proposed method and the better performance of the periodically varying code over the traditional random dithering code.
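The iteration described above can be sketched schematically in Python. The blending operator, the sparse transform (a plain FFT stand-in), and all parameter values here are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def soft_threshold(coeffs, t):
    """Complex soft thresholding: shrink magnitudes by t, keep phase."""
    mag = np.abs(coeffs)
    return coeffs * np.maximum(mag - t, 0.0) / np.maximum(mag, 1e-30)

def deblend(d_pseudo, interference_op, n_iter=30, decay=8.0):
    """Schematic deblending loop: estimate the interference from the
    current signal estimate, subtract it from the pseudodeblended data,
    then threshold in a sparse domain (an FFT as a toy stand-in) with an
    exponentially decreasing threshold."""
    signal = np.zeros_like(d_pseudo, dtype=float)
    t_max = np.max(np.abs(np.fft.rfft(d_pseudo)))  # start at the largest value
    for k in range(n_iter):
        interference = interference_op(signal)     # blending-operator estimate
        residual = d_pseudo - interference         # subtract the interference
        t_k = t_max * np.exp(-decay * k / max(n_iter - 1, 1))  # exp. decay
        coeffs = soft_threshold(np.fft.rfft(residual), t_k)
        signal = np.fft.irfft(coeffs, n=len(d_pseudo))
    return signal

# Toy usage: a sinusoid plus incoherent spikes standing in for blending
# noise; the interference operator here is a crude zero-estimate placeholder.
t = np.linspace(0, 1, 256, endpoint=False)
noisy = np.sin(2 * np.pi * 5 * t)
noisy[::37] += 3.0
recovered = deblend(noisy, interference_op=lambda s: np.zeros_like(s))
```

In a real implementation the interference operator would apply the blending (shot-time shift and sum) and its adjoint, and the sparse domain would typically be a curvelet or seislet transform rather than an FFT.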


2019 ◽  
Author(s):  
Attila Lengyel ◽  
David W. Roberts ◽  
Zoltán Botta-Dukát

Abstract
Aims: To introduce REMOS, a new iterative reallocation method (with two variants) for vegetation classification, and to compare its performance with OPTSIL. We test (1) how effectively REMOS and OPTSIL maximize mean silhouette width and minimize the number of negative silhouette widths when run on classifications with different structure; (2) how these three methods differ in runtime with different sample sizes; and (3) whether classifications by the three reallocation methods differ in the number of diagnostic species, a surrogate for interpretability.
Study area: Simulation; example data sets from grasslands in Hungary and forests in Wyoming and Utah, USA.
Methods: We classified random subsets of simulated data with the flexible-beta algorithm for different values of beta. These classifications were subsequently optimized by REMOS and OPTSIL and compared for mean silhouette width and proportion of negative silhouette widths. Then, we classified three vegetation data sets of different sizes into two to ten clusters, optimized them with the reallocation methods, and compared their runtimes, mean silhouette widths, numbers of negative silhouette widths, and numbers of diagnostic species.
Results: In terms of mean silhouette width, OPTSIL performed best when the initial classifications already had high mean silhouette width. REMOS algorithms had slightly lower mean silhouette width than what was maximally achievable with OPTSIL, but their efficiency was consistent across different initial classifications; thus REMOS was significantly superior to OPTSIL when the initial classification had low mean silhouette width. REMOS resulted in zero or a negligible number of negative silhouette widths across all classifications. OPTSIL performed similarly when the initial classification was effective but could not reach as low a proportion of misclassified objects when the initial classification was inefficient. REMOS algorithms were typically more than an order of magnitude faster than OPTSIL. There was no clear difference between REMOS and OPTSIL in the number of diagnostic species.
Conclusions: REMOS algorithms may be preferable to OPTSIL when (1) the primary objective is to reduce or eliminate negative silhouette widths in a classification, (2) the initial classification has low mean silhouette width, or (3) the time efficiency of the algorithm is important because of the size of the data set or the high number of clusters.
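The reallocation idea can be sketched in a few lines of numpy. This is our simplification for illustration, not the published REMOS algorithm: objects with negative silhouette width are moved, worst first, to their nearest alternative cluster:

```python
import numpy as np

def silhouette_widths(X, labels):
    """Silhouette width s(i) = (b - a) / max(a, b) per object, where a is
    the mean distance to its own cluster and b the smallest mean distance
    to any other cluster."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    clusters = np.unique(labels)
    s = np.zeros(len(X))
    for i in range(len(X)):
        own = labels[i]
        mask = labels == own
        n_own = mask.sum()
        if n_own <= 1:          # singleton cluster: silhouette defined as 0
            continue
        a = D[i, mask].sum() / (n_own - 1)  # excludes zero distance to self
        b = min(D[i, labels == c].mean() for c in clusters if c != own)
        s[i] = (b - a) / max(a, b)
    return s

def remos_like(X, labels, max_iter=20):
    """Iteratively move the worst negative-silhouette object to its
    nearest other cluster (by mean distance) until none remain."""
    labels = labels.copy()
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    for _ in range(max_iter):
        s = silhouette_widths(X, labels)
        neg = np.where(s < 0)[0]
        if len(neg) == 0:
            break
        i = neg[np.argmin(s[neg])]          # worst-classified object first
        others = [c for c in np.unique(labels) if c != labels[i]]
        labels[i] = min(others, key=lambda c: D[i, labels == c].mean())
    return labels

# Toy example: two 2-D blobs with one point initially misassigned.
X = np.array([[0.0, 0], [0.1, 0], [0.2, 0], [5.0, 0], [5.1, 0], [5.2, 0]])
labels = np.array([0, 0, 1, 1, 1, 1])       # point 2 is misassigned
fixed = remos_like(X, labels)               # point 2 moves to cluster 0
```

The published method differs in its update rules and its two variants, but the goal is the same: drive the number of negative silhouette widths to zero or near zero.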


2020 ◽  
Author(s):  
Mark Tamisiea ◽  
Benjamin Krichman ◽  
Himanshu Save ◽  
Srinivas Bettadpur ◽  
Zhigui Kang ◽  
...  

To assess the quality of the CSR solutions, we compare results against external data sets that have contemporaneous availability. These evaluations fall into three categories: changes in terrestrial water storage against data from the North American and Global Land Data Assimilation Systems, variations in ocean bottom pressure against data from the Deep-ocean Assessment and Reporting of Tsunamis (DART) network, and estimates of the low-degree and low-order Stokes coefficients compared against those inferred from satellite laser ranging observations (i.e. the CSR monthly 5x5 gravity harmonics from the MEaSUREs project). As the mission provides a unique measurement of mass changes in the Earth system, evaluation of the new solutions against other data sets and models is challenging. Thus, we primarily focus on the relative agreement of these data sets with the GRACE-FO solutions, in relation to the historic agreement of the data sets with the GRACE solutions.


2018 ◽  
Vol 2 ◽  
pp. e25929
Author(s):  
Christina Byrd

The Darwin Core data standard has rapidly become the go-to standard for biological and paleontological specimens. In order to accommodate all of the timescale data for paleontology specimens, standards for geologic age were developed and incorporated into Darwin Core. At the Sternberg Museum of Natural History (FHSM), digitization of the paleontology collection has been a primary objective. The adoption of the Darwin Core standard for FHSM’s paleontology data spurred the idea to use Darwin Core for the geology collection as well. There are currently no widely accepted data standards for geology specimens, although some organizations have made their data management standards available online. Even though Darwin Core was developed for the dissemination of biological information, many of the data fields are applicable to geology. FHSM is working to adopt and adapt Darwin Core standards for its geology collection. FHSM currently has 84 fields to record geology data. Approximately sixty percent of these data fields directly correspond with Darwin Core terms and have been adopted with the corresponding data format. Seven percent of the fields correspond with Darwin Core terms but require adaptation by adding new shared language within the term. These fields include the classification of rocks and minerals and the addition of “geologicSpecimen” for the Darwin Core term “Basis Of Record”. Fortunately, minerals have a classification system that loosely resembles animal taxonomy. For example, quartz is a mineral species that is part of a group called Tectosilicates, which is subsequently grouped into Silicates. One quarter of the FHSM fields are specific to geology and do not fit within the current Darwin Core data set. When determining terminology for these fields, FHSM staff utilized the terms and standards set by the Open Geospatial Consortium (OGC), an international organization that develops open standards for the global geospatial community. 
The terms adopted from the OGC come from a category called “EarthMaterial.” The remaining fields are specific to FHSM recordkeeping. In order to share these terms with others and hopefully start a larger conversation about data standards for this area of natural history, the terms and definitions will be made available on the FHSM website in the geology section. Using the same terms, formats, and overall standard across the disciplines at FHSM increases usability and uniformity of the different data sets, increases workflow efficiency, and simplifies development of the relational database for paleontological and geological specimens at FHSM.
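The adopt-and-adapt mapping described above can be pictured as a simple field lookup; the local field names on the left are invented for illustration, while `catalogNumber` and `basisOfRecord` are real Darwin Core terms:

```python
# Hypothetical mapping of local FHSM catalogue fields to Darwin Core
# terms; the keys are invented for illustration.
FIELD_MAP = {
    "CatalogNumber": "catalogNumber",    # direct Darwin Core adoption
    "MineralSpecies": "scientificName",  # adapted for mineral classification
    "RecordType": "basisOfRecord",       # extended with "geologicSpecimen"
}

def to_darwin_core(record):
    """Rename the fields of a local record that have Darwin Core
    counterparts; fields without a mapping are kept as-is."""
    return {FIELD_MAP.get(k, k): v for k, v in record.items()}

converted = to_darwin_core({"CatalogNumber": "FHSM-G-123",
                            "RecordType": "geologicSpecimen"})
```

A shared mapping table like this is what lets the same relational database serve both the paleontological and the geological collections.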


Geophysics ◽  
2021 ◽  
pp. 1-56
Author(s):  
Flavio Poletto ◽  
Alex Goertz ◽  
Cinzia Bellezza ◽  
Endre Vange Bergfjord ◽  
Piero Corubolo ◽  
...  

Seismic while drilling (SWD) with a drill-bit source has been used successfully over the past decades and has proven itself in a variety of configurations in onshore applications. The method creates a reverse vertical seismic profile (RVSP) data set from surface sensors deployed as arrays in the proximity of the monitored wells. The typical application makes use of rig-pilot reference (pilot) sensors at the top of the drill string and also downhole. This approach provides while-drilling checkshots as well as multioffset RVSP for 2-D and 3-D imaging around the well and prediction ahead of the bit. For logistical (sensor deployment) and cost (rig time related to technical installation) reasons, the conventional drill-bit SWD application is typically much easier onshore than offshore. We present a novel approach that uses a network of passive-monitoring sea-bottom nodes, pre-deployed for microseismic monitoring, to simultaneously and effectively record offshore SWD data. We study the results of a pilot test in which we passively monitored the drilling of an appraisal well at the Wisting discovery in the Barents Sea with an ocean-bottom cable deployed temporarily around the drilling rig. The continuous passive recording of vibration signals emitted during the drilling of the well provides the SWD data set, which is treated as a reverse vertical seismic profile. The study is performed without a rig-pilot signal. The results are compared with legacy data, demonstrate the effectiveness of the approach, and point to future applications for real-time monitoring of the drilling progress, both in terms of geosteering the drill bit and predicting formation properties ahead of the bit by reflection imaging.


2019 ◽  
Vol 35 (4) ◽  
pp. 1637-1661 ◽  
Author(s):  
Xavier Bellagamba ◽  
Robin Lee ◽  
Brendon A. Bradley

The ambitious scopes of recent earthquake ground motion studies are generating a need for more high-quality ground motion records. As the number of deployed sensors is rapidly growing through improved accessibility and cost (e.g., ground motion stations, low-cost accelerometers, smart phones), an exponentially increasing amount of data are being generated. Previously, quality-assured ground motion data sets for engineering applications were generated using both manual and automated quality screening methodologies. More recently, new techniques have emerged that potentially offer both improved classification accuracy and computational expediency. This work presents a machine learning–oriented method to facilitate and accelerate the quality classification of ground motion records from small magnitude earthquakes. Feedforward neural networks are selected for their ability to efficiently recognize patterns and are trained on two New Zealand data sets. An application to physics-based ground motion simulation validation indicates that the proposed approach delivers results that are comparable with manual quality selection. Robust automatic ground motion quality screening allows a significant increase in data set size for development, calibration, and validation of ground motion models.
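The classification step can be pictured with a minimal feedforward network in numpy; the architecture, features, and (untrained) weights below are illustrative and are not those fitted to the New Zealand data sets:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TinyMLP:
    """One-hidden-layer feedforward classifier mapping a feature vector
    extracted from a record (e.g. signal-to-noise ratio, peak amplitude,
    spectral measures) to a quality score in (0, 1)."""
    def __init__(self, n_features, n_hidden=8):
        self.W1 = rng.normal(0.0, 0.5, (n_features, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, 1))
        self.b2 = np.zeros(1)

    def score(self, x):
        h = np.tanh(x @ self.W1 + self.b1)   # hidden-layer activations
        return sigmoid(h @ self.W2 + self.b2)[0]

# A record is kept when its quality score clears a chosen threshold.
net = TinyMLP(n_features=5)
features = rng.normal(size=5)   # placeholder feature vector
keep = net.score(features) > 0.5
```

In practice the weights would be trained on labeled records (e.g. by backpropagation on a manually screened data set), and the score threshold tuned to trade false accepts against false rejects.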


Geophysics ◽  
2010 ◽  
Vol 75 (1) ◽  
pp. Q11-Q20 ◽  
Author(s):  
R. James Brown

In four-component (4-C) towed ocean-bottom-cable (OBC) data sets, acquisition footprints are often observed. Sometimes these exhibit a spatial period equal to the length of the receiver cable. I have analyzed a 2D 4-C OBC data set, looking at common-offset gathers (COG), spectral analyses, and hodogram analyses of the direct P-wave first breaks. The acquisition footprint is seen to be directly related to the following effects observed on a few of the multicomponent receivers, namely, those nearest to the towing vessel: significant delays on the inline component though not on the downgoing direct-P first breaks; depletion of higher frequencies (narrower bandwidth) on the inline component; and oscillatory motion closer to the vertical on the direct-P first breaks, equivalent to decreased amplitude on the inline component. This is interpreted to be a result of the towing procedure, wherein the leading end of the cable, with the first few receiver modules, is raised from the seafloor and laid down again, relatively lightly, on top of seafloor material that might be poorly consolidated, while the trailing receivers are pulled through and down into this material. For these leading receiver modules, this results in poor inline horizontal coupling (i.e., slipping) and delayed P-S onsets due to their vertically higher positions (relative to the trailing receivers) and quite high near-seafloor [Formula: see text] ratios. To rectify this problem in future acquisition, a longer lead-in cable should prevent lifting of the leading receivers and allow all of them to couple with the seafloor in the same way. For data already acquired with an acquisition footprint on the inline component, a two-step process involving surface-consistent deconvolution or trace equalization and static correction is proposed.


2018 ◽  
Vol 154 (2) ◽  
pp. 149-155
Author(s):  
Michael Archer

1. Yearly records of worker Vespula germanica (Fabricius) taken in suction traps at Silwood Park (28 years) and at Rothamsted Research (39 years) are examined.
2. Using the autocorrelation function (ACF), a significant negative 1-year lag followed by a lesser, non-significant positive 2-year lag was found in all, or parts of, each data set, indicating an underlying population dynamic of a 2-year cycle with a damped waveform.
3. The minimum number of years before the 2-year cycle with damped waveform was shown varied between 17 and 26, or the cycle was not found in some data sets.
4. Ecological factors delaying or preventing the occurrence of the 2-year cycle are considered.
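The lag diagnostic in point 2 can be reproduced on a synthetic damped 2-year cycle with a short Python sketch; the series below is invented for illustration, not the trap data:

```python
import numpy as np

def acf(x, lag):
    """Sample autocorrelation of series x at the given lag."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    return np.sum(x[:-lag] * x[lag:]) / np.sum(x * x)

# Synthetic yearly counts alternating high/low (a 2-year cycle) with a
# slight damping trend, standing in for the wasp-worker trap records.
years = np.arange(30)
counts = 100 + 40 * np.cos(np.pi * years) * np.exp(-years / 40.0)

r1, r2 = acf(counts, 1), acf(counts, 2)
# A 2-year cycle appears as a negative lag-1 and a positive lag-2
# autocorrelation, the pattern reported in point 2.
assert r1 < 0 and r2 > 0
```

Damping of the waveform would show up as the lag-1 and lag-2 correlations shrinking toward zero in later windows of the series.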


2018 ◽  
Vol 21 (2) ◽  
pp. 117-124 ◽  
Author(s):  
Bakhtyar Sepehri ◽  
Nematollah Omidikia ◽  
Mohsen Kompany-Zareh ◽  
Raouf Ghavami

Aims & Scope: In this research, eight variable selection approaches were used to investigate the effect of variable selection on the predictive power and stability of CoMFA models. Materials & Methods: Three data sets, comprising 36 EPAC antagonists, 79 CD38 inhibitors, and 57 ATAD2 bromodomain inhibitors, were modelled by CoMFA. First, for all three data sets, CoMFA models with all CoMFA descriptors were created; then, by applying each variable selection method, a new CoMFA model was developed, so that nine CoMFA models were built for each data set. The results obtained show that noisy and uninformative variables affect CoMFA results. Based on the created models, applying five variable selection approaches, namely FFD, SRD-FFD, IVE-PLS, SRD-UVE-PLS, and SPA-jackknife, increases the predictive power and stability of CoMFA models significantly. Result & Conclusion: Among them, SPA-jackknife removes most of the variables, while FFD retains most of them. FFD and IVE-PLS are time-consuming processes, whereas SRD-FFD and SRD-UVE-PLS run in a few seconds. Applying FFD, SRD-FFD, IVE-PLS, and SRD-UVE-PLS also preserves CoMFA contour-map information for both fields.

