obspyDMT: A Python Toolbox for Retrieving and Processing Large Seismological Datasets


Solid Earth ◽  
2017 ◽  
Vol 8 (5) ◽  
pp. 1047-1070 ◽  
Author(s):  
Kasra Hosseini ◽  
Karin Sigloch

Abstract. We present obspyDMT, a free, open-source software toolbox for the query, retrieval, processing and management of seismological data sets, including very large, heterogeneous and/or dynamically growing ones. ObspyDMT simplifies and speeds up user interaction with data centers, in more versatile ways than existing tools. The user is shielded from the complexities of interacting with different data centers and data exchange protocols and is provided with powerful diagnostic and plotting tools to check the retrieved data and metadata. While primarily a productivity tool for research seismologists and observatories, easy-to-use syntax and plotting functionality also make obspyDMT an effective teaching aid. Written in the Python programming language, it can be used as a stand-alone command-line tool (requiring no knowledge of Python) or can be integrated as a module with other Python codes. It facilitates data archiving, preprocessing, instrument correction and quality control – routine but nontrivial tasks that can consume much user time. We describe obspyDMT's functionality, design and technical implementation, accompanied by an overview of its use cases. As an example of a typical problem encountered in seismogram preprocessing, we show how to check for inconsistencies in response files of two example stations. We also demonstrate the fully automated request, remote computation and retrieval of synthetic seismograms from the Synthetics Engine (Syngine) web service of the Data Management Center (DMC) at the Incorporated Research Institutions for Seismology (IRIS).
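The abstract mentions instrument correction as one of the routine but non-trivial preprocessing steps the toolbox automates. As a hedged illustration of what that step involves, the following sketch uses ObsPy (the library obspyDMT is built on) to deconvolve an instrument response; it is generic example code, not obspyDMT's internal implementation, and the file paths are placeholders.

```python
# Generic instrument-correction sketch using ObsPy (the library obspyDMT
# builds on). File paths are placeholders; this is not obspyDMT's own code.
from obspy import read, read_inventory

st = read("waveforms/II.AAK.00.BHZ.mseed")       # raw waveform data
inv = read_inventory("stationxml/II.AAK.xml")    # station metadata with response

st.detrend("linear")                             # remove linear trend before filtering
st.taper(max_percentage=0.05)                    # taper edges to limit filter artefacts
# Deconvolve the instrument response; pre_filt stabilizes the deconvolution
st.remove_response(inventory=inv, output="VEL",
                   pre_filt=(0.005, 0.01, 8.0, 10.0))
st.write("corrected/II.AAK.00.BHZ.mseed", format="MSEED")
```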


2020 ◽  
Author(s):  
Sebastian Heimann ◽  
Marius Kriegerowski ◽  
Marius Isken ◽  
Hannes Vasyura-Bathke ◽  
Simone Cesca ◽  
...  

Pyrocko is an open source seismology toolbox and library, written in the Python programming language. It can be utilized flexibly for a variety of geophysical tasks, such as seismological data processing and analysis; modelling of waveform, InSAR or GPS displacement data; or seismic source characterization. At its core, Pyrocko is a library and framework providing building blocks for researchers and students wishing to develop their own applications. Pyrocko also contains several standalone applications for everyday seismological practice. These include the Snuffler program, an extensible seismogram browser and workbench; the Cake tool, providing travel-time and ray-path computations for 1D layered earth models; Fomosto, a tool to manage pre-calculated Green's function stores; Jackseis, a command-line tool for common waveform archive data manipulations; Colosseo, a tool to create synthetic earthquake scenarios, serving waveforms and static displacements; and, newly, Sparrow, a 3D geophysical data visualization tool. This poster gives a glimpse of Pyrocko's features; for more examples and tutorials, visit https://pyrocko.org/.
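As an illustration of one of these building blocks, the following sketch computes travel times with the Cake module, following the pattern of Pyrocko's published examples; exact defaults and attribute names may differ between Pyrocko versions.

```python
# Travel-time computation with Pyrocko's cake module, following the pattern
# of the examples at https://pyrocko.org/.
from pyrocko import cake

km = 1000.0
model = cake.load_model()                  # built-in default 1D layered earth model
distance_deg = 1500 * km * cake.m2d        # epicentral distance, metres -> degrees
phases = [cake.PhaseDef('P'), cake.PhaseDef('S')]

for arrival in model.arrivals([distance_deg], phases=phases, zstart=10 * km):
    print(arrival.t)                       # travel time in seconds
```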


1999 ◽  
Vol 39 (4) ◽  
pp. 193-201
Author(s):  
P. J. A. Gijsbers

The need for integrated analysis creates a demand for the integration of computer models, with extra attention to interfaces, data management and user interaction. Sector-wide standardization using data dictionaries and data exchange formats can be a great help in streamlining data exchange. However, this type of standardization can have drawbacks for a generic framework for model integration. Another concept, called the Model Data Dictionary (MDD), has been developed as an alternative for proper data management. The concept is a variant of the federated database concept, in which local databases maintain their autonomy while an interconnection database provides a link for sharing data. The MDD is based on a highly generic data model for geographically referenced objects which, if needed, facilitates mapping of the sector-wide data dictionary. External interfaces, in combination with a data format mapping component, provide a link to SQL-based data sources and model-specific databases. A generic Object Data Editor (ODE), linked to the MDD, has been proposed to provide a common data editing facility for mathematical models. A test version of the combined MDD/ODE concept has shown its applicability to the integration of all kinds of geographic, object-oriented mathematical models (both simulation and optimization).
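A minimal sketch of the MDD idea may help: local sources keep their own schemas while a central dictionary maps their records onto one generic model of geographically referenced objects. All class, field and function names below are hypothetical illustrations, not taken from the paper.

```python
# Hypothetical sketch of the Model Data Dictionary idea: local data sources
# stay autonomous, while a central dictionary maps them onto one generic
# model of geographically referenced objects. All names are illustrative.
from dataclasses import dataclass, field


@dataclass
class GeoObject:
    object_id: str
    object_type: str              # e.g. "river_reach", "reservoir"
    geometry: tuple               # simplified: an (x, y) location
    attributes: dict = field(default_factory=dict)


class ModelDataDictionary:
    """Federated layer: sources keep their schemas, mappings translate them."""

    def __init__(self):
        self._sources = {}        # source name -> (fetch function, mapping)

    def register(self, name, fetch, mapping):
        self._sources[name] = (fetch, mapping)

    def objects(self):
        for fetch, mapping in self._sources.values():
            for record in fetch():       # records in the source's own schema
                yield mapping(record)    # translated to the shared GeoObject model


# A model-specific record and its mapping into the shared model:
def fetch_hydro():
    return [{"id": "R1", "kind": "river_reach", "x": 5.1, "y": 52.0, "q": 120.0}]

mdd = ModelDataDictionary()
mdd.register("hydro_model", fetch_hydro,
             lambda r: GeoObject(r["id"], r["kind"], (r["x"], r["y"]),
                                 {"discharge": r["q"]}))

for obj in mdd.objects():
    print(obj.object_id, obj.object_type, obj.attributes)
```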


2011 ◽  
Vol 16 (9) ◽  
pp. 1059-1067 ◽  
Author(s):  
Peter Horvath ◽  
Thomas Wild ◽  
Ulrike Kutay ◽  
Gabor Csucs

Imaging-based high-content screens often rely on single-cell evaluation of phenotypes in large data sets of microscopic images. Traditionally, these screens are analyzed by extracting a few image-derived parameters and using their ratios (linear single- or multiparametric separation) to classify the cells into various phenotypic classes. In this study, the authors show how machine learning–based classification of individual cells outperforms those classical ratio-based techniques. Using fluorescence intensity, morphological, and texture features, they evaluated how the performance of the analysis increases with the number of features. Their findings are based on a case study involving an siRNA screen monitoring nucleoplasmic and nucleolar accumulation of a fluorescently tagged reporter protein. For the analysis, they developed a complete workflow incorporating image segmentation, feature extraction, cell classification, hit detection, and visualization of the results. For the classification task, the authors established a new graphical framework, the Advanced Cell Classifier, which provides highly accurate high-content screen analysis with minimal user interaction, offering access to a variety of advanced machine learning methods.
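The contrast the authors draw, a single intensity ratio versus a classifier over many features, can be sketched on synthetic data; the snippet below is illustrative only and is not the Advanced Cell Classifier's own code, data or feature set.

```python
# Contrast between a single ratio-style rule and a multi-feature classifier,
# on synthetic per-cell features (illustrative only).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 1000
# Simulated per-cell features: two intensity measures plus additional
# morphology/texture features that a ratio-based rule ignores.
X = rng.normal(size=(n, 10))
y = (X[:, 0] - X[:, 1] + 0.8 * X[:, 2]
     + rng.normal(scale=0.5, size=n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Ratio-style rule: threshold on the difference of two intensity features only
ratio_pred = (X_te[:, 0] - X_te[:, 1] > 0).astype(int)
print("ratio rule accuracy:", accuracy_score(y_te, ratio_pred))

# Machine-learning classifier using all features
clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
print("classifier accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```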


2012 ◽  
Vol 4 (4) ◽  
pp. 15-30 ◽  
Author(s):  
John Haggerty ◽  
Mark C. Casson ◽  
Sheryllynne Haggerty ◽  
Mark J. Taylor

The increasing use of social media, applications or platforms that allow users to interact online, ensures that this environment will provide a useful source of evidence for the forensics examiner. Current tools for the examination of digital evidence find this data problematic as they are not designed for the collection and analysis of online data. Therefore, this paper presents a framework for the forensic analysis of user interaction with social media. In particular, it presents an inter-disciplinary approach for the quantitative analysis of user engagement to identify relational and temporal dimensions of evidence relevant to an investigation. This framework enables the analysis of large data sets from which a (much smaller) group of individuals of interest can be identified. In this way, it may be used to support the identification of individuals who might be ‘instigators’ of a criminal event orchestrated via social media, or a means of potentially identifying those who might be involved in the ‘peaks’ of activity. In order to demonstrate the applicability of the framework, this paper applies it to a case study of actors posting to a social media Web site.
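The temporal side of such a framework, binning posts into time windows, flagging windows of unusually high activity and listing the most active actors within them, can be sketched as follows; the data and the simple peak criterion are hypothetical illustrations, not the paper's method.

```python
# Illustrative sketch: bin posts into hourly windows, flag "peaks" of
# activity, and list the most active actors within them. Synthetic data.
import pandas as pd

posts = pd.DataFrame({
    "actor": ["a", "b", "a", "c", "a", "b", "d", "a"],
    "time": pd.to_datetime([
        "2012-06-01 10:02", "2012-06-01 10:07", "2012-06-01 10:09",
        "2012-06-01 10:11", "2012-06-01 14:30", "2012-06-02 09:00",
        "2012-06-02 09:05", "2012-06-02 09:06",
    ]),
})

counts = posts.set_index("time").resample("1h")["actor"].count()
peaks = counts[counts > counts.mean() + counts.std()]   # simple peak criterion
print("peak windows:\n", peaks)

# Most active actors within each peak window
for window in peaks.index:
    mask = (posts["time"] >= window) & (posts["time"] < window + pd.Timedelta("1h"))
    print(window, posts.loc[mask, "actor"].value_counts().head(3).to_dict())
```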


2006 ◽  
Vol 78 (3) ◽  
pp. 613-631 ◽  
Author(s):  
Richard Cammack ◽  
Yang Fann ◽  
Robert J. Lancashire ◽  
John P. Maher ◽  
Peter S. McIntyre ◽  
...  

In this document, we define a data exchange format initially formulated from discussions of an International Union of Pure and Applied Chemistry (IUPAC) limited-term task group at the 35th Royal Society of Chemistry ESR conference in Aberdeen in 2002. The definition of this format is based on the IUPAC Joint Committee on Atomic and Molecular Physical Data exchange (JCAMP-DX) protocols, which were developed for the exchange of infrared spectra and extended to chemical structures, nuclear magnetic resonance data, mass spectra, and ion mobility spectra. The JCAMP-DX standard was further extended to cover year-2000-compatible date strings and good laboratory practice, and the next release will cover the information needed for storing n-dimensional data sets. The aim of this paper is to adapt JCAMP-DX to the special requirements of electron magnetic resonance (EMR).
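JCAMP-DX files are built from labeled data records of the form ##LABEL=value, terminated by ##END=. The sketch below writes a minimal file using core labels from the published JCAMP-DX protocols; the EMR-specific labels defined in this IUPAC recommendation are not reproduced here, and the example values are placeholders.

```python
# Minimal writer for the labeled-data-record syntax JCAMP-DX is built on
# (##LABEL=value lines, terminated by ##END=). Core labels are from the
# published JCAMP-DX protocols; example values are placeholders.
def write_jcamp(path, title, xunits, yunits, x, y):
    with open(path, "w") as f:
        f.write(f"##TITLE={title}\n")
        f.write("##JCAMP-DX=4.24\n")
        f.write(f"##XUNITS={xunits}\n")
        f.write(f"##YUNITS={yunits}\n")
        f.write(f"##FIRSTX={x[0]}\n")
        f.write(f"##LASTX={x[-1]}\n")
        f.write(f"##NPOINTS={len(x)}\n")
        f.write("##XYDATA=(X++(Y..Y))\n")
        for xi, yi in zip(x, y):       # one abscissa and one ordinate per line
            f.write(f"{xi} {yi}\n")
        f.write("##END=\n")

write_jcamp("spectrum.jdx", "example EMR spectrum", "GAUSS", "ARBITRARY UNITS",
            [3300 + i for i in range(5)], [0.1, 0.5, 1.0, 0.4, 0.1])
```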


Science ◽  
1989 ◽  
Vol 246 (4933) ◽  
pp. 984-984
Author(s):  
J. R. FILSON ◽  
J. PETERSON

2015 ◽  
Vol 15 (2) ◽  
pp. 335-347 ◽  
Author(s):  
A. C. Aydinoglu ◽  
M. S. Bilgin

Abstract. Disaster management aims to reduce the catastrophic losses caused by disasters. Geographic information technologies support disaster management activities with effective and collaborative data management, given the complex nature of disasters. This study takes an original conceptual approach, aiming to develop an interoperable geographic data model and analysis tools to manage geographic data sets coming from different sources. For landslide disasters, 39 scenario-based activities were analysed with the required data according to user needs, across a cycle of activities in the mitigation, preparedness, response, and recovery phases. An interoperable geographic data model for disaster management (ADYS), enabling up-to-date exchange of geographic data, was designed in compliance with the standards of ISO/TC 211 Geographic Information/Geomatics, the Open Geospatial Consortium (OGC), and the Turkish National GIS (TUCBS). A free, open-source analysis toolbox was developed and tested in a case study of activities such as landslide hazard analysis and a disaster warning system, to support the Provincial Disaster Management Centres of Turkey. Open data models and analysis tools make effective activity management and data sharing possible; however, transforming data sets into data exchange formats remains laborious.
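As a purely hypothetical sketch of what an interoperable, exchange-ready geographic record for such a model might look like (the field names below are illustrative and are not taken from the ADYS model), a GeoJSON-style feature could carry geometry plus the metadata needed for up-to-date exchange between phases:

```python
# Hypothetical GeoJSON-style record for disaster management data exchange.
# Field names are illustrative, not taken from the ADYS model itself.
import json

landslide_hazard_zone = {
    "type": "Feature",
    "geometry": {"type": "Polygon",
                 "coordinates": [[[29.0, 40.9], [29.1, 40.9],
                                  [29.1, 41.0], [29.0, 41.0], [29.0, 40.9]]]},
    "properties": {
        "theme": "landslide",
        "phase": "mitigation",          # disaster-management phase this layer serves
        "hazard_level": "high",
        "source_agency": "provincial_disaster_centre",
        "last_updated": "2015-01-15",   # supports the up-to-date exchange requirement
        "crs": "EPSG:4326",
    },
}

print(json.dumps(landslide_hazard_zone, indent=2))   # GeoJSON-style exchange format
```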

