Towards Ontology-based Data Quality Inference in Large-Scale Sensor Networks

Author(s):  
Sam Esswein ◽  
Sebastien Goasguen ◽  
Chris Post ◽  
Jason Hallstrom ◽  
David White ◽  
...  

Author(s):  
Abdelhady M. Naguib ◽  
Shahzad Ali

Background: Many applications of Wireless Sensor Networks (WSNs) require awareness of a sensor node's location, but not every sensor node can be equipped with a GPS receiver, due to cost and energy constraints, especially in large-scale networks. Many localization algorithms have therefore been proposed that enable a sensor node to determine its location by utilizing a small number of special nodes, called anchors, that are equipped with GPS receivers. In recent years, a promising method that significantly reduces cost has been to replace the set of statically deployed GPS anchors with a single mobile anchor node, equipped with a GPS unit, that moves to cover the entire network. Objectives: This paper proposes a novel static path planning mechanism that enables a single anchor node to follow a predefined static path while periodically broadcasting its current location coordinates to nearby sensors. The new path type, called SQUARE_SPIRAL, is specifically designed to reduce collinearity during localization. Results: Simulation results show that the SQUARE_SPIRAL mechanism outperforms other static path planning methods with respect to multiple performance metrics. Conclusion: This work includes an extensive comparative study of existing static path planning methods and compares the proposed mechanism with existing solutions through extensive simulations in NS-2.
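The abstract does not specify how the SQUARE_SPIRAL trajectory is constructed; purely as an illustration, a minimal sketch of generating square-spiral waypoints for a mobile anchor might look like the following (the function name and parameters are hypothetical, not taken from the paper):

```python
def square_spiral(center, step, turns):
    """Generate waypoints of an outward square spiral.

    center: (x, y) starting point; step: leg-length increment (e.g. tied
    to the anchor's broadcast radius); turns: number of spiral layers.
    Returns a list of (x, y) waypoints the anchor visits in order.
    """
    x, y = center
    waypoints = [(x, y)]
    dx, dy = 1, 0          # start moving east
    leg = step
    for i in range(turns * 2):
        x, y = x + dx * leg, y + dy * leg
        waypoints.append((x, y))
        dx, dy = -dy, dx   # turn 90 degrees counter-clockwise
        if i % 2 == 1:     # every second leg, grow the spiral outward
            leg += step
    return waypoints
```

Because consecutive legs are perpendicular, broadcast positions along such a path are never all on one line near a sensor, which is the intuition behind reducing collinearity.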


2009 ◽  
Vol 13 (1) ◽  
pp. 40-43
Author(s):  
Shaoliang Peng ◽  
Guoliang Xing ◽  
Shanshan Li ◽  
Weijia Jia ◽  
Yuxing Peng

Electronics ◽  
2021 ◽  
Vol 10 (2) ◽  
pp. 218
Author(s):  
Ala’ Khalifeh ◽  
Khalid A. Darabkh ◽  
Ahmad M. Khasawneh ◽  
Issa Alqaisieh ◽  
Mohammad Salameh ◽  
...  

The advent of various wireless technologies has paved the way for the realization of new infrastructures and applications for smart cities. Wireless Sensor Networks (WSNs) are among the most important of these technologies and are widely used in many applications in our daily lives. Due to their cost effectiveness and rapid deployment, WSNs can help secure smart cities by providing remote monitoring and sensing for many critical scenarios, including hostile environments, battlefields, and areas subject to natural disasters such as earthquakes, volcano eruptions, and floods, or to large-scale accidents such as nuclear plant explosions or chemical plumes. The purpose of this paper is to propose a new framework in which WSNs are adopted for remote sensing and monitoring in smart city applications. We propose using Unmanned Aerial Vehicles (UAVs) as data mules to offload the sensor nodes and transfer the monitoring data securely to a remote control center for further analysis and decision making. Furthermore, the paper provides insight into the implementation challenges involved in realizing the proposed framework. In addition, the paper provides an experimental evaluation of the proposed design in outdoor environments, in the presence of different types of obstacles common to typical outdoor fields. The experimental evaluation revealed several inconsistencies between the performance metrics advertised in the hardware data sheets and those measured in the field; in particular, we found mismatches between the advertised coverage distance and signal strength and our experimental measurements. It is therefore crucial that network designers and developers conduct field tests and assess device performance before designing and implementing a WSN for a real field setting.
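The gap between advertised and measured coverage can be reasoned about with a standard log-distance path-loss model. The sketch below is illustrative only; the reference RSSI, reference distance, and path-loss exponent are assumed values, and this is not the evaluation method used in the paper:

```python
import math

def rssi_log_distance(d, rssi_d0=-40.0, d0=1.0, n=2.7):
    """Predicted RSSI (dBm) at distance d (m) under the log-distance
    path-loss model. rssi_d0 is the RSSI at reference distance d0;
    n is the path-loss exponent (n ~ 2 in free space, higher with
    obstacles). All parameter values here are assumptions."""
    return rssi_d0 - 10.0 * n * math.log10(d / d0)

def max_range(sensitivity_dbm, rssi_d0=-40.0, d0=1.0, n=2.7):
    """Distance at which the predicted RSSI falls to the receiver
    sensitivity threshold, i.e. an estimate of usable coverage."""
    return d0 * 10 ** ((rssi_d0 - sensitivity_dbm) / (10.0 * n))
```

Fitting `n` to field measurements, rather than trusting free-space assumptions behind a data-sheet range figure, is one simple way to quantify the mismatch the authors observed.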


2021 ◽  
Vol 11 (2) ◽  
pp. 214
Author(s):  
Anna Kaiser ◽  
Pascal-M. Aggensteiner ◽  
Martin Holtmann ◽  
Andreas Fallgatter ◽  
Marcel Romanos ◽  
...  

Electroencephalography (EEG) is a widely established method for assessing altered and typically developing brain function. However, systematic studies on EEG data quality, its correlates, and its consequences are scarce. To address this research gap, the current study focused on the percentage of artifact-free segments after standard EEG pre-processing as a data quality index. We analyzed participant-related and methodological influences, and assessed validity by replicating landmark EEG effects. Further, effects of data quality on spectral power analyses beyond participant-related characteristics were explored. EEG data from a multicenter ADHD cohort (age range 6 to 45 years) and a non-ADHD school-age control group were analyzed (total n = 305). Resting-state data during eyes-open and eyes-closed conditions, and task-related data during a cued Continuous Performance Task (CPT), were collected. After pre-processing, general linear models and stepwise regression models were fitted to the data. We found that EEG data quality was strongly related to demographic characteristics but not to methodological factors. We were able to replicate maturational, task, and ADHD effects reported in the EEG literature, establishing a link with landmark EEG effects. Furthermore, we showed that poor data quality significantly increases spectral power beyond the effects of maturation and symptom severity. Taken together, the current results indicate that, with careful design and systematic quality control, informative large-scale multicenter trials characterizing neurophysiological mechanisms in neurodevelopmental disorders across the lifespan are feasible. Nevertheless, the results are subject to the reported limitations. Future work will clarify the predictive value of this data quality index.
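The data quality index described above, the percentage of artifact-free segments after pre-processing, is straightforward to compute once segments have been marked. A minimal sketch, assuming a boolean rejection mask per fixed-length segment (our assumption, not the authors' pipeline):

```python
def artifact_free_percentage(artifact_mask):
    """Data quality index: percentage of artifact-free segments.

    artifact_mask: sequence of booleans, one per EEG segment after
    pre-processing; True marks a segment rejected as
    artifact-contaminated.
    """
    if not artifact_mask:
        raise ValueError("no segments provided")
    clean = sum(1 for rejected in artifact_mask if not rejected)
    return 100.0 * clean / len(artifact_mask)
```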


2021 ◽  
Vol 11 (2) ◽  
pp. 472
Author(s):  
Hyeongmin Cho ◽  
Sangkyun Lee

Machine learning has proven effective in various application areas, such as object and speech recognition on mobile systems. Since a critical key to machine learning success is the availability of large training datasets, many datasets are being disclosed and published online. From a data consumer's or manager's point of view, measuring data quality is an important first step in the learning process: we need to determine which datasets to use, update, and maintain. However, few practical ways to measure data quality are available today, especially for large-scale high-dimensional data such as images and videos. This paper proposes two data quality measures that compute class separability and in-class variability, two important aspects of data quality, for a given dataset. Classical data quality measures tend to focus only on class separability; however, we suggest that in-class variability is another important data quality factor. We provide efficient algorithms to compute our quality measures based on random projections and bootstrapping, with statistical benefits on large-scale high-dimensional data. In experiments, we show that our measures are compatible with classical measures on small-scale data and can be computed much more efficiently on large-scale high-dimensional datasets.
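The paper's actual measures are not reproduced here; as a rough illustration of the two quality aspects and of random projection for dimensionality reduction, a toy sketch (all function names and formulas are simplified assumptions, not the authors' algorithms) could look like:

```python
import math
import random

def random_projection(X, k, seed=0):
    """Project n x d data (list of lists) onto k random Gaussian
    directions, scaled by 1/sqrt(k) so pairwise distances are
    roughly preserved (Johnson-Lindenstrauss style)."""
    rng = random.Random(seed)
    d = len(X[0])
    R = [[rng.gauss(0, 1) / math.sqrt(k) for _ in range(d)]
         for _ in range(k)]
    return [[sum(r_i * x_i for r_i, x_i in zip(r, x)) for r in R]
            for x in X]

def separability_and_variability(X, y):
    """Toy versions of the two quality aspects for binary labels:
    class separability as the distance between class centroids,
    in-class variability as the mean distance of each point to its
    own class centroid."""
    def centroid(pts):
        return [sum(col) / len(pts) for col in zip(*pts)]
    def dist(a, b):
        return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))
    cents = {lab: centroid([x for x, l in zip(X, y) if l == lab])
             for lab in set(y)}
    labs = sorted(cents)
    sep = dist(cents[labs[0]], cents[labs[1]])
    var = sum(dist(x, cents[l]) for x, l in zip(X, y)) / len(X)
    return sep, var
```

Projecting first and then computing such statistics on bootstrap resamples is the general shape of the efficiency argument for large-scale high-dimensional data.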


Sensors ◽  
2013 ◽  
Vol 13 (12) ◽  
pp. 17241-17264 ◽  
Author(s):  
Federico Domínguez ◽  
Nguyen The Cuong ◽  
Felipe Reinoso ◽  
Abdellah Touhafi ◽  
Kris Steenhaut
