Understanding the Gap Between Information Models and Realism-Based Ontologies Using the Generic Component Model

2021 ◽  
Author(s):  
Mathias Brochhausen ◽  
Sarah J. Bost ◽  
Nitya Singh ◽  
Christoph Brochhausen ◽  
Bernd Blobel

The widespread use of Common Data Models and information models in biomedical informatics encourages the assumption that those models provide everything needed for knowledge representation. Given the lack of computable semantics in frequently used Common Data Models, there appears to be a gap between knowledge representation requirements and these models. In this use-case-oriented approach, we explore how a system-theoretic, architecture-centric, ontology-based methodology can help to better understand this gap. We show how using the Generic Component Model helps to analyze a data management system in a way that accounts for both the data management procedures inside the system and the knowledge representation of the real world.
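To make the missing computable semantics concrete, consider a minimal sketch (purely illustrative, not the authors' Generic Component Model implementation; the class names, the IRI, and the owlready2 library are assumptions) contrasting a flat Common Data Model record with an ontology class that a reasoner can actually interpret:

```python
# Illustrative sketch: a flat CDM record versus a realism-based
# ontology class carrying computable semantics (owlready2 library).
from owlready2 import Thing, ObjectProperty, get_ontology

# In a Common Data Model, "aspirin" is just a string in a column;
# nothing states what a drug exposure *is*.
cdm_row = {"person_id": 42, "drug_exposure": "aspirin"}

onto = get_ontology("http://example.org/demo.owl")  # hypothetical IRI

with onto:
    class Person(Thing): pass
    class DrugExposure(Thing): pass

    class has_participant(ObjectProperty):
        domain = [DrugExposure]
        range = [Person]

    # Axiom: every drug exposure is a process with some person as a
    # participant -- a claim about reality that a reasoner can check,
    # which a bare CDM column cannot express.
    DrugExposure.is_a.append(has_participant.some(Person))
```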

2021 ◽  
Vol 2021 ◽  
pp. 1-9
Author(s):  
Luka Gradišar ◽  
Matevž Dolenc

An efficient database management system that supports the integration and interoperability of different information models is a foundation on which the higher levels of cyber-physical systems are built. In this paper, we address the problem of integrating monitoring data with building information models through the use of a graph data management system and the Industry Foundation Classes (IFC) standard, supporting the need for interoperability and collaborative work. The proposed workflow describes the conversion of IFC models into a graph database and the connection with data from sensors, which is then validated using the example of a bridge monitoring system. The presented IFC and sensor graph data models are structurally flexible and scalable to meet the challenges of smart cities and big data.
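The conversion step could look like the following minimal sketch, assuming the ifcopenshell and neo4j Python libraries and a local Neo4j instance; the file name, credentials, sensor ID, and the sensor-to-element mapping are illustrative assumptions, not the authors' implementation:

```python
# Minimal sketch: load an IFC model and mirror its elements as nodes
# in a Neo4j graph database, then attach a sensor reading to one node.
import ifcopenshell
from neo4j import GraphDatabase

model = ifcopenshell.open("bridge.ifc")  # illustrative file name
driver = GraphDatabase.driver("bolt://localhost:7687",
                              auth=("neo4j", "password"))

with driver.session() as session:
    # One node per IFC element, keyed by its GlobalId.
    for element in model.by_type("IfcElement"):
        session.run(
            "MERGE (e:IfcElement {globalId: $gid}) "
            "SET e.ifcType = $type, e.name = $name",
            gid=element.GlobalId, type=element.is_a(),
            name=element.Name or "",
        )
    # Connect a sensor to the element it monitors; the mapping of
    # sensor IDs to element GlobalIds is assumed to be known.
    session.run(
        "MATCH (e:IfcElement {globalId: $gid}) "
        "MERGE (s:Sensor {id: $sid})-[:MONITORS]->(e) "
        "SET s.lastValue = $value",
        gid="2O2Fr$t4X7Zf8NOew3FLOH", sid="strain-01", value=0.42,
    )

driver.close()
```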


Author(s):  
William M. Marsden ◽  
Elizabeth R. Cope

Materials property data, and broader materials information, are essential to the wide range of people and functions in engineering enterprises involved in the design and construction of power generation facilities. This paper focuses on how this complex and specialist information must be managed, and how it can be deployed most effectively to those who need it. There are many different types of data to be considered, but they all begin as test results of one form or another. Generating this test data represents a major cost for many organizations. Applying it effectively can give them a major competitive advantage: for example, avoiding problems in design, gaining more understanding of the performance of materials, and ultimately reducing maintenance costs or enabling asset life extension. Yet few organizations have any systematic approach in place to manage materials data. Not only does this mean that they are not making the best use of their investment in this asset; it means that they can waste large amounts of time and money searching for the right data, or duplicating tests that have already been done. This paper concentrates on the practical challenges involved, and on the technical details of how these challenges can be solved, drawing on the experience of developing the GRANTA MI materials information management system in collaboration with a consortium of leading aerospace, energy, and defense organizations. Three use-case scenarios are explored in which the management and application of materials data are important: support for engineering design, statistical process control, and materials selection. For each use-case, the technical requirements for any corporate materials data management system are identified. Significant overlaps are found between the requirements for the different areas, indicating that a well-designed system can both meet a broad range of specific needs and help to integrate different aspects of an organization's operations. It is then appropriate to discuss the actual software tools required to meet these needs. These include tools to capture and consolidate materials data, to analyze and apply the data, to maintain a corporate materials information resource, and to deploy materials data enterprise-wide to the different functions (e.g., Design and Quality Assurance) and roles (e.g., design engineers, stress analysts, and process improvement managers) that require it. The paper closes by providing guidelines on best practice for implementing materials data management technology.
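For the statistical process control use-case, the core computation behind such tooling is simple; the following is a minimal sketch of Shewhart-style 3-sigma control limits over batch test results (illustrative only, not GRANTA MI code; the measurements are made up):

```python
# Illustrative sketch: 3-sigma Shewhart control limits for a series
# of material test results (e.g. tensile strength per batch).
import statistics

def control_limits(measurements: list[float]) -> tuple[float, float, float]:
    """Return (lower limit, centre line, upper limit)."""
    centre = statistics.mean(measurements)
    sigma = statistics.stdev(measurements)
    return centre - 3 * sigma, centre, centre + 3 * sigma

batch_strengths = [512.0, 508.5, 515.2, 510.1, 509.8, 513.4]  # MPa, made up
lcl, cl, ucl = control_limits(batch_strengths)
out_of_control = [x for x in batch_strengths if not lcl <= x <= ucl]
print(f"LCL={lcl:.1f}, CL={cl:.1f}, UCL={ucl:.1f}, flagged={out_of_control}")
```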


2003 ◽  
Vol 21 (1) ◽  
pp. 49-62 ◽  
Author(s):  
G. M. R. Manzella ◽  
E. Scoccimarro ◽  
N. Pinardi ◽  
M. Tonani

Abstract. A "ship of opportunity" program was launched as part of the Mediterranean Forecasting System Pilot Project (MFSPP). During the operational period (September 1999 to May 2000), six tracks covered the Mediterranean from the northern to the southern boundaries approximately every 15 days, while a long east-west track from Haifa to Gibraltar was covered approximately every month. XBT data were collected, sub-sampled at 15 inflection points and transmitted through a satellite communication system to a regional data centre. It was found that this data transmission system has limitations in terms of the quality of the temperature profiles and the quantity of data successfully transmitted. At the end of the MFSPP operational period, a new strategy for data transmission and management was developed. First, VOS-XBT data are transmitted at full resolution. Second, a new data management system, called Near Real Time Quality Control for XBT (NRT.QC.XBT), was defined to produce a parallel stream of high-quality XBT data for further scientific analysis. The procedure includes: (1) position control; (2) elimination of spikes; (3) re-sampling at a 1 metre vertical interval; (4) filtering; (5) general malfunctioning check; (6) comparison with climatology (and distance from it in terms of standard deviations); (7) visual check; and (8) data consistency check. The first six steps of the new procedure are completely automated; they are also performed using a new climatology developed as part of the project. The visual checks are finally done with free-market software that allows final NRT data assessment. Key words: Oceanography: physical (instruments and techniques; general circulation; hydrography)
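Steps (2) and (3) of such a pipeline are straightforward to automate; the following numpy sketch is an illustrative reconstruction, not the MFSPP code, and the spike-detection window and threshold are assumed tuning values:

```python
# Illustrative sketch of QC steps (2) and (3): remove temperature
# spikes from an XBT profile, then re-sample onto a 1 m vertical grid.
import numpy as np

def despike(depth, temp, window=2, threshold=2.0):
    """Drop points deviating from the median of their neighbours
    by more than `threshold` degrees C (assumed tuning values)."""
    med = np.array([np.median(temp[max(0, i - window): i + window + 1])
                    for i in range(len(temp))])
    keep = np.abs(temp - med) <= threshold
    return depth[keep], temp[keep]

def resample_1m(depth, temp):
    """Linearly interpolate the profile onto a 1 m vertical interval."""
    grid = np.arange(np.ceil(depth.min()), np.floor(depth.max()) + 1.0, 1.0)
    return grid, np.interp(grid, depth, temp)

depth = np.array([0.7, 2.1, 3.6, 5.0, 6.4, 7.9])       # m, made-up profile
temp = np.array([19.8, 19.7, 25.0, 19.5, 19.4, 19.3])  # deg C, one spike
d, t = despike(depth, temp)            # removes the 25.0 deg C spike
grid, t_1m = resample_1m(d, t)         # temperatures on a 1 m grid
```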


2017 ◽  
Vol 4 (1) ◽  
pp. 62-66
Author(s):  
Luyen Ha Nam

Information has long held, and still holds, a critical position in all aspects of life, from individuals to organizations. At ABC company, information is highly sensitive and important. There are many ways to keep information safe: on hard drives, removable discs, etc.; other organizations even maintain data centres to store their information. The objective of information security is to keep information safe from unwanted access. We applied the Risk Mitigation Action framework to our data management system, and after several months the results were far better than before: information was more secure, incidents were detected more quickly, and internal and external collaboration improved.


2014 ◽  
Vol 36 (7) ◽  
pp. 1485-1499 ◽  
Author(s):  
Jie SONG ◽  
Tian-Tian LI ◽  
Zhi-Liang ZHU ◽  
Yu-Bin BAO ◽  
Ge YU

1991 ◽  
Author(s):  
Douglas E. Shackelford ◽  
John B. Smith ◽  
Joan Boone ◽  
Barry Elledge

2019 ◽  
Vol 14 (3) ◽  
pp. 160-172 ◽  
Author(s):  
Aynaz Nourani ◽  
Haleh Ayatollahi ◽  
Masoud Solaymani Dodaran

Background: Data management is an important, complex and multidimensional process in clinical trials, and its execution is very difficult and expensive without the use of information technology. A clinical data management system is software widely used for managing the data generated in clinical trials. The objective of this study was to review the technical features of clinical trial data management systems. Methods: Related articles were identified by searching databases such as Web of Science, Scopus, Science Direct, ProQuest, Ovid and PubMed. All of the research papers related to clinical data management systems published between 2007 and 2017 (n=19) were included in the study. Results: Most of the clinical data management systems were web-based systems developed to meet the needs of a specific clinical trial in the shortest possible time. SQL Server and MySQL databases were used in the development of the systems. These systems did not fully support the process of clinical data management. In addition, most of the systems lacked flexibility and extensibility for further development. Conclusion: It seems that most of the systems used in the research centers were weak in terms of supporting the process of data management and managing clinical trial workflows. Therefore, more attention should be paid to designing more complete, usable, and high-quality data management systems for clinical trials. More studies are suggested to identify the features of successful systems used in clinical trials.
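As a point of reference for the kind of data such systems manage, here is a deliberately minimal relational sketch of clinical trial data capture (subjects, visits, observations); it is not taken from any of the reviewed systems, which used SQL Server and MySQL, and sqlite3 is used here only for portability:

```python
# Minimal illustrative sketch of a relational core for clinical
# trial data capture: subjects, visits, and CRF observations.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE subject (
    subject_id   INTEGER PRIMARY KEY,
    site_code    TEXT NOT NULL,
    enrolled_on  TEXT NOT NULL          -- ISO 8601 date
);
CREATE TABLE visit (
    visit_id     INTEGER PRIMARY KEY,
    subject_id   INTEGER NOT NULL REFERENCES subject(subject_id),
    visit_name   TEXT NOT NULL,         -- e.g. 'baseline', 'week-4'
    visit_date   TEXT NOT NULL
);
CREATE TABLE observation (
    obs_id       INTEGER PRIMARY KEY,
    visit_id     INTEGER NOT NULL REFERENCES visit(visit_id),
    item_code    TEXT NOT NULL,         -- CRF item, e.g. 'SYSBP'
    value        TEXT NOT NULL,
    entered_by   TEXT NOT NULL          -- rudimentary audit trail
);
""")

# Hypothetical example rows.
conn.execute("INSERT INTO subject VALUES (1, 'SITE-01', '2017-03-02')")
conn.execute("INSERT INTO visit VALUES (1, 1, 'baseline', '2017-03-02')")
conn.execute("INSERT INTO observation VALUES (1, 1, 'SYSBP', '128', 'nurse01')")
conn.commit()
```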

