Open Sensor Manager for IIoT

2020 ◽  
Vol 9 (2) ◽  
pp. 30 ◽  
Author(s):  
Riku Ala-Laurinaho ◽  
Juuso Autiosalo ◽  
Kari Tammi

Data collection in an industrial environment enables several benefits: processes and machinery can be monitored, performance can be optimized, and machinery can be proactively maintained. To collect data from machines or production lines, numerous sensors are required, which necessitates a management system. The management of constrained IoT devices such as sensor nodes has been extensively studied. However, previous studies focused only on the remote software updating or configuration of sensor nodes. This paper presents a holistic Open Sensor Manager (OSEMA), which also addresses generating software for different sensor models based on the configuration. In addition, it offers a user-friendly web interface as well as a REST API (Representational State Transfer Application Programming Interface) for management. The manager is built with the Django web framework, and the sensor nodes rely on ESP32-based microcontrollers. OSEMA enables secure remote software updates of sensor nodes via encryption and a hash-based message authentication code (HMAC). The collected data can be transmitted using the Hypertext Transfer Protocol (HTTP) and Message Queuing Telemetry Transport (MQTT). The use of OSEMA is demonstrated in an industrial domain with applications estimating the usage roughness of an overhead crane and tracking its location. OSEMA enables retrofitting different sensors to existing machinery and processes, allowing additional data collection.
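The abstract mentions securing remote firmware updates with a hash-based message authentication code. As a minimal sketch of that idea (not OSEMA's actual implementation; the shared key and function names are assumptions), the manager can tag each firmware image with an HMAC and the sensor node can refuse any image whose tag does not verify:

```python
import hashlib
import hmac

# Hypothetical pre-shared key provisioned to both manager and sensor node.
SHARED_KEY = b"example-shared-key"

def sign_firmware(firmware: bytes) -> str:
    """Manager side: compute an HMAC-SHA256 tag over a firmware image."""
    return hmac.new(SHARED_KEY, firmware, hashlib.sha256).hexdigest()

def verify_firmware(firmware: bytes, tag: str) -> bool:
    """Sensor-node side: accept the update only if the tag matches.
    compare_digest avoids leaking information via timing differences."""
    expected = sign_firmware(firmware)
    return hmac.compare_digest(expected, tag)
```

In practice the image would also be encrypted, as the paper notes; the HMAC alone only guarantees integrity and authenticity, not confidentiality.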

Author(s):  
Adian Fatchur Rochim ◽  
Abda Rafi ◽  
Adnan Fauzi ◽  
Kurniawan Teguh Martono

The use of information technology these days is very high: activities from business to education rely on it most of the time. Information technology uses computer networks for data integration and management. To avoid business problems, the number of network devices installed requires a manageable network configuration for easier maintenance. Traditionally, each network device has to be configured manually by network administrators; this process is time-consuming and inefficient. Network automation methods exist to overcome this repetitive process. The design model uses a web-based application to maintain and automate networking tasks. In this research, the network automation system was implemented as a controller application that uses a REST API (Representational State Transfer Application Programming Interface) architecture and is built with the Django framework in the Python programming language. The design model is named the As-RaD System. The network device used in this research is the Cisco CSR1000V, because it supports REST API communication to manage its network configuration and can also be deployed on a server. The As-RaD System performs 75% faster than Paramiko and 92% faster than NAPALM.
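A controller like the one described typically sends device configuration as a JSON body over the device's REST interface. As an illustrative sketch only (the structure below follows the standard `ietf-interfaces` YANG model used by RESTCONF-capable routers; it is not necessarily the exact payload As-RaD sends), a helper can build the body for configuring an interface:

```python
import json

def build_interface_payload(name: str, ip: str, mask: str) -> str:
    """Build a RESTCONF-style JSON body for configuring one interface.
    The controller would PUT/POST this to the router's management API."""
    body = {
        "ietf-interfaces:interface": {
            "name": name,
            "type": "iana-if-type:ethernetCsmacd",
            "enabled": True,
            "ietf-ip:ipv4": {"address": [{"ip": ip, "netmask": mask}]},
        }
    }
    return json.dumps(body)
```

Because the payload is plain data rather than a CLI session, the controller can validate and reuse it across many devices, which is where the speedup over per-device SSH tools like Paramiko comes from.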


2020 ◽  
Vol 9 (4) ◽  
pp. 394-402
Author(s):  
Helmy ◽  
Athadhia Febyana ◽  
Agung Al Rasyid ◽  
Arif Nursyahid ◽  
Thomas Agung Setyawan ◽  
...  

Aquaponics combines aquaculture with hydroponics. One hydroponic system is the drip system. Parameters to monitor in aquaponic cultivation include the acidity of the nutrient solution (pH), the water temperature, and the nutrient concentration indicated by the total dissolved solids (TDS) in the water. Plant nutrients are obtained from nitrogen-containing fish waste. Therefore, real-time monitoring of pH, TDS, and temperature, along with control of soil moisture for the aquaponic plants, is needed so that the plants do not lack nutrients. The control process uses a Representational State Transfer Application Programming Interface (REST API) to receive the threshold values set by the aquaponics farmer through a website, and to send the soil-moisture value and the fish-pond parameters (pH, temperature, and TDS) to the server. Data-loss and delay testing of this monitoring and control system is needed to determine the reliability of the device in sending and receiving data. In addition, an e-mail notification to the farmer is required when the soil-moisture value falls below the threshold. The test results show that the system can send an e-mail notification to the farmer when the soil moisture falls below the threshold; the average node-gateway monitoring delay is 6.01 seconds, the average gateway-server monitoring delay is 10.02 seconds, and the average server-gateway control delay is 92.55 seconds.
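The control rule in the abstract — compare incoming readings against farmer-defined thresholds and notify only on low soil moisture — can be sketched as a small decision function (parameter names and the dict-based interface are assumptions for illustration, not the system's actual code):

```python
def check_readings(readings: dict, thresholds: dict) -> list:
    """Return the names of monitored parameters whose value has fallen
    below the farmer-defined threshold received via the REST API."""
    return [name for name, value in readings.items()
            if name in thresholds and value < thresholds[name]]

def should_notify(readings: dict, thresholds: dict) -> bool:
    """E-mail the farmer only when soil moisture is below its threshold,
    mirroring the notification rule described in the abstract."""
    return "soil_moisture" in check_readings(readings, thresholds)
```

The gateway would run this check on each monitoring cycle and trigger the e-mail path only on a True result, keeping notification traffic separate from the regular node-gateway-server reporting.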


2015 ◽  
Vol 87 (11-12) ◽  
pp. 1127-1137
Author(s):  
Stuart J. Chalk

Abstract: This paper details an approach to re-purposing scientific data presented on a web page, for the sole purpose of making the data more available for searching and integration into other websites. Data 'scraping' is used to extract metadata from a set of pages on the National Institute of Standards and Technology (NIST) website and to clean, organize, and store that metadata in a MySQL database. The metadata is then used to create a new website at the author's institution, using the CakePHP framework to create a representational state transfer (REST)-style application program interface (API). The processes used for website analysis, schema development, database construction, metadata scraping, REST API development, and remote data integration are discussed. Lessons learned, along with tips and tricks on how to get the most out of the process, are also included.


2021 ◽  
Vol 8 (2) ◽  
pp. 180-185
Author(s):  
Anna Tolwinska

This article aims to explain the key metadata elements listed in Participation Reports, why it's important to check them regularly, and how Crossref members can improve their scores. Crossref members register a lot of metadata in Crossref. That metadata is machine-readable, standardized, and then shared across discovery services and author tools. This is important because richer metadata makes content more discoverable and useful to the scholarly community. It's not always easy to know what metadata Crossref members register in Crossref. This is why Crossref created an easy-to-use tool called Participation Reports, which shows editors and researchers the key metadata elements Crossref members register to make their content more useful. The key metadata elements include references and whether they are set to open, ORCID iDs, funding information, Crossmark metadata, licenses, full-text URLs for text mining, and Similarity Check indexing, as well as abstracts. ROR IDs (Research Organization Registry identifiers), which identify institutions, will be added in the future. This data was always available through Crossref's REST API (Representational State Transfer Application Programming Interface) but is now visualized in Participation Reports. To improve scores, editors should encourage authors to submit ORCID iDs in their manuscripts, and publishers should register as much metadata as possible to help drive research further.
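Since the same data is available through the public Crossref REST API, a member's coverage can also be read programmatically. A minimal sketch, assuming the members route and its `coverage` field names as documented for the public API (treat the exact keys as an assumption and check the live response):

```python
def member_url(member_id: int) -> str:
    """URL of the public Crossref REST API members route for one member."""
    return f"https://api.crossref.org/members/{member_id}"

def coverage_summary(message: dict) -> dict:
    """Pick a few key coverage ratios (0..1) out of a members-route
    response body; missing fields default to 0.0."""
    cov = message.get("coverage", {})
    keys = ["orcids-current", "references-current", "abstracts-current"]
    return {k: cov.get(k, 0.0) for k in keys}
```

Fetching `member_url(...)` with any HTTP client and passing the decoded `message` object to `coverage_summary` yields the same ratios that Participation Reports visualize.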


2021 ◽  
Author(s):  
Zohar Naor

Abstract: This study proposes user-initiated detection and data gathering from power-limited and even passive wireless devices, such as passive RFID tags, wireless sensor networks (WSNs), and Internet of Things (IoT) devices, for which either power limitations or poor cellular coverage prevent direct communication with wireless networks. While previous studies focused on sensors that continuously transmit their data, the focus of this study is on passive devices. The key idea is that instead of receiving the data transmitted by the sensor nodes, an external device (a reader), such as an unmanned aerial vehicle (UAV) or a smartphone, is used to detect IoT devices, read the data stored in the sensor nodes, and then deliver it to the cloud, where it is stored and processed. While previous studies on UAV-aided data collection from WSNs focused on UAV path planning, the focus of this study is on the rate at which the passive sensor nodes should be polled; that is, on finding the minimal monitoring rate that still guarantees accurate and reliable data collection. The proposed scheme enables the deployment of wireless sensor networks over a large geographic area (e.g., for agricultural applications) in which cellular coverage is very poor, if present at all. Furthermore, user-initiated data collection can enable the deployment of passive WSNs, and can thus significantly reduce both the operational cost and the deployment cost of the WSN.


Sensors ◽  
2020 ◽  
Vol 20 (19) ◽  
pp. 5654
Author(s):  
Moonseong Kim ◽  
Sooyeon Park ◽  
Woochan Lee

With the growing interest in big data technology, mobile IoT devices play an essential role in data collection. Generally, IoT sensor nodes are randomly distributed over areas where data cannot be easily collected. When data collection becomes impossible (i.e., when sensing holes occur) due to improper placement of sensors or their energy exhaustion, the sensors should be relocated. The cluster header in the sensing hole sends requests to neighboring cluster headers for sensors to be relocated. However, sensors in specific cluster zones near the sensing hole may be requested to move repeatedly. This can lead to a ping-pong problem, in which the cluster headers of neighboring sensing holes repeatedly request the movement of sensors in the counterpart sensing hole. In this paper, we first propose a near-uniform selection and movement scheme for the sensors to be relocated. With this scheme, the energy consumption of the sensors is equalized and the sensing capability is extended; thus, the network lifetime is extended. Next, the proposed relocation protocol resolves the ping-pong problem using queues with request scheduling. Another crucial contribution of this paper is that performance was analyzed using a fully customized OMNeT++ simulator that reflects actual environmental conditions, rather than oversimplified artificial network conditions. The proposed relocation protocol demonstrates uniform and energy-efficient movement with ping-pong-free capability.
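The queue-with-scheduling idea can be illustrated with a deliberately simplified sketch (this is not the paper's protocol; it only shows one way a scheduler can refuse a request that would immediately reverse the previous move between the same pair of clusters):

```python
from collections import deque

class RelocationScheduler:
    """Queue relocation requests and drop any request that would
    immediately undo the most recent executed move (ping-pong)."""
    def __init__(self):
        self.queue = deque()
        self.last_move = None  # (src, dst) of the most recently executed move

    def request(self, src, dst) -> bool:
        """Enqueue a move of sensors from cluster src to cluster dst.
        Returns False (and drops the request) if it reverses last_move."""
        if self.last_move == (dst, src):
            return False
        self.queue.append((src, dst))
        return True

    def execute_next(self):
        """Pop and 'execute' the oldest pending move, remembering it."""
        if not self.queue:
            return None
        move = self.queue.popleft()
        self.last_move = move
        return move
```

A real protocol must also track request history over longer windows and balance sensor energy, which is what the paper's near-uniform selection scheme adds on top of plain queuing.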


2015 ◽  
Vol 6 ◽  
pp. 1609-1634 ◽  
Author(s):  
Nina Jeliazkova ◽  
Charalampos Chomenidis ◽  
Philip Doganis ◽  
Bengt Fadeel ◽  
Roland Grafström ◽  
...  

Background: The NanoSafety Cluster, a cluster of projects funded by the European Commission, identified the need for a computational infrastructure for toxicological data management of engineered nanomaterials (ENMs). Ontologies, open standards, and interoperable designs were envisioned to empower a harmonized approach to European research in nanotechnology. This setting provides a number of opportunities and challenges in the representation of nanomaterials data and the integration of ENM information originating from diverse systems. Within this cluster, eNanoMapper works towards supporting the collaborative safety assessment for ENMs by creating a modular and extensible infrastructure for data sharing, data analysis, and building computational toxicology models for ENMs. Results: The eNanoMapper database solution builds on the previous experience of the consortium partners in supporting diverse data through flexible data storage, open source components and web services. We have recently described the design of the eNanoMapper prototype database along with a summary of challenges in the representation of ENM data and an extensive review of existing nano-related data models, databases, and nanomaterials-related entries in chemical and toxicogenomic databases. This paper continues with a focus on the database functionality exposed through its application programming interface (API), and its use in visualisation and modelling. Considering the preferred community practice of using spreadsheet templates, we developed a configurable spreadsheet parser facilitating user-friendly data preparation and data upload. We further present a web application able to retrieve the experimental data via the API and analyze it with multiple data preprocessing and machine learning algorithms.
Conclusion: We demonstrate how the eNanoMapper database is used to import and publish online ENM and assay data from several data sources, how the "representational state transfer" (REST) API enables building user-friendly interfaces and graphical summaries of the data, and how these resources facilitate the modelling of reproducible quantitative structure–activity relationships for nanomaterials (NanoQSAR).
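The configurable spreadsheet parser mentioned in the Results can be sketched as a column-mapping step: a configuration object maps template column headers to database field names, and rows missing mapped columns are skipped. This is an illustrative simplification, not eNanoMapper's actual parser, and all names below are assumptions:

```python
def parse_rows(rows: list, column_map: dict) -> list:
    """Map spreadsheet-style rows (dicts keyed by template column headers)
    onto database field names using a configurable column map.
    Rows missing any mapped column are skipped rather than half-imported."""
    records = []
    for row in rows:
        if not all(col in row for col in column_map):
            continue
        records.append({field: row[col] for col, field in column_map.items()})
    return records
```

Keeping the header-to-field mapping in configuration rather than code is what makes the parser adaptable to different community templates without reprogramming.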


2021 ◽  
Vol 2021 ◽  
pp. 1-19
Author(s):  
Wenquan Jin ◽  
Rongxu Xu ◽  
Sunhwan Lim ◽  
Dong-Hwan Park ◽  
Chanwon Park ◽  
...  

The Internet of Things (IoT) enables the number of connected devices to increase rapidly, based on heterogeneous technologies such as platforms, frameworks, libraries, protocols, and standard specifications. Based on the connected devices, various applications can be developed by integrating domain-specific contents through service composition to provide improved services. Management of the information, including devices, contents, and composite objects, is necessary to represent the physical objects on the Internet so that the IoT services can be accessed transparently. In this paper, we propose an integrated service composition approach based on multiple service providers to provide improved IoT services by combining various service objects in heterogeneous IoT networks. In the proposed IoT architecture, each service provider offers web services based on a Representational State Transfer (REST) Application Programming Interface (API) that delivers information to clients as well as to other providers, which integrate the information to provide new services. Through the REST APIs, the integration management provider combines the service results of the IoT service provider with other contents to provide improved services. Moreover, an interworking proxy is proposed to bridge heterogeneous IoT networks, enabling transparent access to the integrated services by providing protocol translation at the entry of the device networks. The interworking proxy is therefore deployed between the IoT service provider and the device networks, enabling clients to access heterogeneous IoT devices transparently through the composed services.
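The integration step — an integration management provider calling several providers' REST APIs and merging their results into one response — can be sketched with plain functions standing in for HTTP calls (a minimal illustration of the composition pattern, not the paper's implementation):

```python
def compose(providers: list, request: dict) -> dict:
    """Call each provider's handler in turn and merge the JSON-like
    results into one response, the way an integration management
    provider combines the outputs of multiple REST services."""
    result = {}
    for provider in providers:
        result.update(provider(request))
    return result
```

In a deployed system each `provider` would be an HTTP GET against that provider's REST endpoint, and the interworking proxy would sit below this layer, translating device-network protocols so every provider can be called the same way.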


Competitive ◽  
2021 ◽  
Vol 16 (2) ◽  
pp. 87-94
Author(s):  
Supono Syafiq ◽  
Sari Armiati

Data is an essential part of today's information-technology transformation: communication is no longer limited by differences in the devices used, so information can be accessed easily. The Institute for Research and Community Service (LPPM) of Politeknik Pos Indonesia already has a web-based database (APTIMAS) for managing research, community-service, publication, and intellectual-property (IPR) data. At present, access to APTIMAS data is centralized and not possible from applications other than APTIMAS itself, so a middleware or web service is needed to let other applications access APTIMAS data, for example for mobile-application development, dashboards in other applications, and other data needs. Therefore, a web service with a Representational State Transfer (REST) Application Programming Interface (API) architecture was built to serve as a bridge for data communication. With this middleware web service, APTIMAS, as the provider of research, community-service, publication, and IPR data within Politeknik Pos Indonesia, is expected to serve as the data source for other application development and data needs on the Politeknik Pos Indonesia campus, without direct access to the APTIMAS database.
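The middleware's job is to expose database-backed resources behind REST-style routes so clients never touch the database directly. A minimal dispatcher sketch (paths, payloads, and the handler interface are hypothetical; a real service would run behind a web framework):

```python
import json

def make_service(handlers: dict):
    """Return a dispatcher mapping REST-style paths to handler functions.
    Each handler returns JSON-serializable data; unknown paths get a 404,
    so clients only ever see the API surface, never the database."""
    def service(path: str):
        if path not in handlers:
            return json.dumps({"error": "not found"}), 404
        return json.dumps(handlers[path]()), 200
    return service
```

A client application (a mobile app or dashboard) would issue HTTP GETs against these paths, while the handlers encapsulate the actual APTIMAS database queries.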


F1000Research ◽  
2021 ◽  
Vol 10 ◽  
pp. 1227
Author(s):  
Emmanuel Baldwin Mbaya ◽  
Babatunde Alao ◽  
Philip Ewejobi ◽  
Innocent Nwokolo ◽  
Victoria Oguntosin ◽  
...  

Background: In this work, a COVID-19 Application Programming Interface (API) was built using the Representational State Transfer (REST) API architecture; it is designed to fetch data daily from the Nigerian Center for Disease Control (NCDC) website. Methods: The API is developed with the ASP.NET Core Web API framework, using the C# programming language and Visual Studio 2019 as the Integrated Development Environment (IDE). The application is deployed to Microsoft Azure as the cloud hosting platform; using Hangfire, a job is scheduled to run every day at 12:30 pm (GMT+1) to fetch new data from the NCDC website and load it into the database. Various API endpoints are defined to interact with the system and retrieve data as needed; data can be fetched for a single state by name, for all states on a particular day, over a range of days, etc. Results: The data showed that Lagos and Abuja FCT were the hardest-hit states in Nigeria in terms of total confirmed cases, while Lagos and Edo states had the highest death casualties, with 465 and 186 respectively as of August 2020. This analysis, and many more, can easily be made with the API we have created, which warehouses all COVID-19 data published by the NCDC since the first confirmed case on February 29, 2020. The system was tested on the BlazeMeter platform and averaged 11 hits/s with a response time of 2905 milliseconds. Conclusions: The extension NaijaCovidAPI offers over existing COVID-19 APIs for Nigeria is the access and retrieval of previous data. Our contribution to the body of knowledge is the creation of a data hub for Nigeria's COVID-19 incidence from February 29, 2020, to date.
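The endpoint patterns described (single state by name, a particular day, a range of days) suggest parameterized query URLs. As an illustration only — the base URL and route names below are hypothetical, not the deployed API's actual paths — a client could construct them like this:

```python
from datetime import date

# Hypothetical base URL; substitute the real deployment address.
BASE = "https://example-naijacovid.api/api"

def state_on_day(state: str, day: date) -> str:
    """Query URL for one state's figures on a given day."""
    return f"{BASE}/states/{state}?date={day.isoformat()}"

def state_over_range(state: str, start: date, end: date) -> str:
    """Query URL for one state's figures over an inclusive date range."""
    return f"{BASE}/states/{state}?from={start.isoformat()}&to={end.isoformat()}"
```

Because the API warehouses history rather than only the latest snapshot, the range form is what distinguishes it from APIs that merely mirror the NCDC's current figures.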

