Geotechnical Data Management System of TRANSPETRO Pipe Routes: GeoRisco

Author(s):  
Renato Seixas da Rocha ◽  
Álvaro Maia da Costa ◽  
Cláudio dos Santos Amaral ◽  
Ana Lúcia Lodi da Cruz ◽  
Anderson Oliveira Soares ◽  
...  

The integrity management of an extensive pipeline network requires the acquisition, transmission, storage, analysis and control of a great amount of data. Guaranteeing data quality at each of these stages is extremely important for an effective decision process. Among the several fields and disciplines related to the structural integrity of onshore pipelines, geotechnics is now receiving due attention. A buried pipeline is increasingly treated as a geotechnical work, since the interaction of the foundation soil with the pipe has proved to be an important phenomenon to monitor in order to keep strain and stress in the pipe at acceptable levels. Therefore, the whole process of reading the geotechnical instrumentation installed along the pipe route has a great impact on the management of pipeline integrity. Examples of such instrumentation include inclinometers, piezometers and strain gages. This work presents the information system and methodology, named the GeoRisco System, that TRANSPETRO implemented to manage its geotechnical instrumentation data. The GeoRisco System embraces almost all stages of the monitoring process where geotechnical instrumentation exists. The monitoring process begins with the standardization of the supply of readings; for this purpose, a program was created to generate XML files that facilitate the transmission and sharing of data on the Web. To store the field instrumentation readings, a database was implemented that centralizes these data and can be accessed from all units of the Company. To help technicians analyze these data, a computer program was created that provides several types of graphs and spreadsheets for each instrument type.
Finally, a program was developed to alert on critical changes in the measured variables selected to control pipeline safety.
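The XML standardization step could be sketched as follows; the element and attribute names here are illustrative assumptions, since the abstract does not describe the actual GeoRisco schema.

```python
import xml.etree.ElementTree as ET

def readings_to_xml(instrument_id, instrument_type, readings):
    """Serialize field-instrument readings into an XML document
    suitable for transmission and sharing on the Web. Element and
    attribute names are hypothetical, not the GeoRisco schema."""
    root = ET.Element("instrument", id=instrument_id, type=instrument_type)
    for timestamp, value in readings:
        node = ET.SubElement(root, "reading", timestamp=timestamp)
        node.text = str(value)
    return ET.tostring(root, encoding="unicode")

# Example: two inclinometer readings on consecutive days.
xml_doc = readings_to_xml(
    "INC-042", "inclinometer",
    [("2009-03-01T08:00:00", 1.27), ("2009-03-02T08:00:00", 1.31)],
)
```

A shared, self-describing format like this lets every unit of the company parse incoming readings the same way before loading them into the centralized database.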

2018 ◽  
Author(s):  
Glenda M. Yenni ◽  
Erica M. Christensen ◽  
Ellen K. Bledsoe ◽  
Sarah R. Supp ◽  
Renata M. Diaz ◽  
...  

Abstract Data management and publication are core components of the research process. An emerging challenge that has received limited attention in biology is managing, working with, and providing access to data under continual active collection. “Evolving data” present unique challenges in quality assurance and control, data publication, archiving, and reproducibility. We developed an evolving-data workflow for a long-term ecological study that addresses many of the challenges associated with managing this type of data. We do this by leveraging existing tools to: 1) perform quality assurance and control; 2) import, restructure, version, and archive data; 3) rapidly publish new data in ways that ensure appropriate credit to all contributors; and 4) automate most steps in the data pipeline to reduce the time and effort required by researchers. The workflow uses two tools from software development, version control and continuous integration, to create a modern data management system that automates the pipeline.
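The quality-assurance step (item 1 above) could be sketched as a check that continuous integration runs against each batch of newly collected records; the field names and thresholds here are hypothetical, not the study's actual QA/QC rules.

```python
def quality_check(records, valid_species, weight_range=(1.0, 300.0)):
    """Flag newly collected records that fail basic QA/QC rules:
    unknown species code, or a missing/out-of-range weight.
    Returns a list of (record_index, problem) pairs; an empty list
    means the batch may be merged into the versioned dataset."""
    errors = []
    for i, rec in enumerate(records):
        if rec.get("species") not in valid_species:
            errors.append((i, "unknown species code"))
        weight = rec.get("weight")
        if weight is None or not (weight_range[0] <= weight <= weight_range[1]):
            errors.append((i, "weight missing or out of range"))
    return errors

# Example batch: one clean record, one with two problems.
errors = quality_check(
    [{"species": "DM", "weight": 42.0},
     {"species": "??", "weight": 900.0}],
    valid_species={"DM", "PP"},
)
```

In a CI setup, a nonzero error list would fail the build, blocking the automated import/version/archive steps until the data are corrected.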


2013 ◽  
Vol 391 ◽  
pp. 603-606
Author(s):  
Jun Fang Li ◽  
Peng Zhang ◽  
Qiang Gao ◽  
Sha Sha Chen

To address the problems of messy, unshared data in crane operation, and the resulting lack of smooth process control and management efficiency, a wireless data acquisition system was designed based on the Nissei ND250 digital radio, an industrial wireless transmission device. The weighing data management software was built in Visual Basic 6.0 combined with a database, and the digital radio is connected through the MSComm control, so the system achieves a seamless connection between each substation and the control center. In addition, by collecting the various parameters into the computer, the system can complete data analysis and storage. To date, there has been no prior application of digital radio to cranes in this way. This method is important for extending the technology to other areas.
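The control-center side of such a system must parse each weighing frame received from a substation over the radio link. A minimal sketch in Python is below; the ASCII frame layout (`WT,<station>,<kg>`) is a hypothetical example, since the abstract does not describe the ND250's actual protocol.

```python
def parse_weighing_frame(frame):
    """Parse one weighing frame received from a substation.
    Expects bytes like b'WT,SUB01,+01234.5\r\n' (hypothetical layout):
    a 'WT' tag, the substation ID, and the weight in kilograms."""
    fields = frame.strip().decode("ascii").split(",")
    if len(fields) != 3 or fields[0] != "WT":
        raise ValueError("malformed frame: %r" % frame)
    return {"station": fields[1], "weight_kg": float(fields[2])}

# Example: one frame as it might arrive from substation SUB01.
record = parse_weighing_frame(b"WT,SUB01,+01234.5\r\n")
```

In the described system this parsing would sit behind the MSComm serial port handler, with each parsed record then inserted into the database for analysis and storage.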


Author(s):  
Ed Wiegele ◽  
David Nemeth ◽  
Shahani Kariyawasam ◽  
Stuart Clouston

Within most pipeline organizations, maintenance and other facility departments use a range of separate data sources and applications to manage the integrity, maintenance and safety of their pipelines. These databases represent a significant investment over many years and are an integral part of day-to-day operations. It is evident that integrating the data into a single, coherent data management system can provide significant benefits. However, the cost of implementing entirely new systems, with intensive data capture programs, is difficult to justify given the earlier investments. As a result, dedicated risk management software using static, separately maintained data is often used as a quick, low-cost alternative to meet regulatory compliance commitments. Experience has shown that, with the right technology and an understanding of an organization's specific needs, a phased approach to integrated data management can be achieved at minimum initial cost by exploiting legacy data. This provides a low-cost yet scalable solution that can grow with the changing needs of the business. In addition to the benefits of legacy data integration, this paper presents an insight into the additional benefits of distributed data access technologies that provide simple, process-focused reporting tools. The key role of data management in risk assessment and the consequent integrity decision-support process is discussed.


Author(s):  
Chet Wood

Abstract An Object Database Management System (ODMS) can be a very useful component when developing applications for use in engineering and manufacturing. Choosing the right product requires a thorough analysis of the data requirements of one’s application, and an equally thorough study of the characteristics of the vendor products. Over a period of about two years, data was gathered on the products of object oriented database manufacturers and researchers. As an example of how to analyze database requirements, an overview of the requirements of our application is presented, followed by a tutorial on the elements and features provided by an ODMS. A brief description is given of each of about a dozen products. Finally, there are tables comparing specific features of a number of these systems.


Electronics ◽  
2021 ◽  
Vol 10 (23) ◽  
pp. 3019
Author(s):  
Young-Hoon Park ◽  
Yejin Kim ◽  
Junho Shim

The advances made in genome technology have resulted in significant amounts of genomic data being generated at increasing speed. As genomic data contain various kinds of privacy-sensitive information, security schemes that protect confidentiality and control access are essential. Many security techniques have been proposed to safeguard healthcare data. However, these techniques are inadequate for genomic data management because of the data's large size. Additionally, privacy problems due to the sharing of gene data are yet to be addressed. In this study, we propose a secure genomic data management system using blockchain and local differential privacy (LDP). The proposed system employs two types of storage: private storage for internal staff and semi-private storage for external users. In private storage, gene data are stored encrypted, so only internal employees can access them. In semi-private storage, gene data are irreversibly modified by LDP: different noise is added to each section of the genomic data, so even if a third party uses or exposes the shared data, the owner's privacy is guaranteed. Furthermore, access control for each storage type is enforced by the blockchain, and the gene owner can trace the usage and sharing status using a decentralized application on a mobile device.
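A standard LDP mechanism that adds independent noise to each section is randomized response; a minimal sketch is below. Encoding gene sections as single bits is an illustrative simplification, since the abstract does not specify the paper's actual mechanism.

```python
import math
import random

def randomized_response(bit, epsilon):
    """Local differential privacy via randomized response: keep the
    true bit with probability e^eps / (e^eps + 1), otherwise flip it.
    Smaller epsilon means more noise and stronger privacy."""
    p_true = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return bit if random.random() < p_true else 1 - bit

def privatize(bits, epsilon=1.0):
    # Independent noise per section: an observer of the output cannot
    # reliably recover any individual section, yet aggregate statistics
    # over many records remain estimable.
    return [randomized_response(b, epsilon) for b in bits]

# Example: privatize a four-section record before placing it in
# semi-private storage.
noisy = privatize([0, 1, 1, 0], epsilon=2.0)
```

Because the flip is random and the true value is never stored in semi-private storage, the modification is irreversible for any single record, matching the property the abstract describes.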


Sensi Journal ◽  
2020 ◽  
Vol 6 (2) ◽  
pp. 236-246
Author(s):  
Ilamsyah Ilamsyah ◽  
Yulianto Yulianto ◽  
Tri Vita Febriani

A correct and appropriate system for receiving and transferring goods is needed by the company. The current process of receiving and transferring goods from the central warehouse to the branch warehouse at PDAM Tirta Kerta Raharja, Tangerang Regency, is done manually and is still ineffective and inaccurate, because the Head of Subdivision uses paper documents as the submission media: the receipt document (PPBP) and the goods mutation document (MPPW). The Head of Subdivision enters receipt and mutation data manually, which takes a relatively long time because, whenever a transfer of goods is requested, the Head of Subdivision must first check the inventory in the central warehouse. Therefore, it is necessary to design a web-based, database-backed information system for the receipt and transfer of goods from the central warehouse to the branch warehouse, so that the process becomes more effective, efficient and accurate. With such a system, the Head of Subdivision can more easily input receipt and transfer data and control stock inventory periodically. Data were collected through observation, interviews and a study of the literature of previous research; the system analysis uses the Waterfall method, the design uses object-oriented visual modeling with UML, and the implementation uses PHP with MySQL as the database.
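The core transfer rule (check central-warehouse stock before recording a mutation) could be sketched as follows; the dict-based store and field names are illustrative stand-ins for the MySQL tables, which the abstract does not describe.

```python
def transfer_goods(stock, item, qty, src="central", dst="branch"):
    """Record a goods transfer only if the source warehouse holds
    enough stock, mirroring the inventory check the Head of
    Subdivision performs before approving a mutation (MPPW).
    'stock' maps (warehouse, item) -> quantity on hand."""
    if stock.get((src, item), 0) < qty:
        raise ValueError("insufficient stock of %s in %s" % (item, src))
    stock[(src, item)] -= qty
    stock[(dst, item)] = stock.get((dst, item), 0) + qty
    return stock

# Example: move 4 pipes from the central warehouse to the branch.
stock = {("central", "pipe"): 10}
transfer_goods(stock, "pipe", 4)
```

In the web system, this check-then-update step would run as a single database transaction so concurrent requests cannot overdraw the central warehouse.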


2017 ◽  
Vol 4 (1) ◽  
pp. 62-66
Author(s):  
Luyen Ha Nam

From long ago until today, information has held a serious position in all aspects of life, from the individual to the organization. In the ABC company, information is very sensitive and very important. There are many ways to keep information safe: on hard drives, on removable discs, etc.; other organizations even have data centres to store their information. The objective of information security is to keep information safe from unwanted access. We applied the Risk Mitigation Action framework to our data management system, and after several months the results were far better than before: information is more secure, incidents are detected more quickly, and internal and external collaboration has improved.

