Evaluating the applicability of a low-cost sensor for measuring PM2.5 concentration in Ho Chi Minh City, Viet Nam

2019 ◽  
Vol 22 (3) ◽  
pp. 343-347
Author(s):  
Chi Doan Thien Nguyen ◽  
Hien Thi To

Introduction: Continuous monitoring provides real-time data that are helpful for measuring air quality; however, these systems are often very expensive, especially for developing countries such as Vietnam. The use of low-cost sensors for monitoring air pollution is a new approach in Vietnam, and this study assesses the utility of low-cost, light-scattering-based particulate sensors for measuring PM2.5 concentrations in Ho Chi Minh City. Methods: The low-cost sensors were compared with both a beta attenuation monitor (BAM) reference method and a gravimetric method during the rainy season, from October to December 2018. Results: The results showed a very strong correlation between the two low-cost sensors (R = 0.97, slope = 1.0), and the sensor precision varied from 0 to 21.4% with a mean of 3.1%. One-minute and one-hour averaged data showed similar correlations between the sensors and the BAM (R2 = 0.62 and 0.69, respectively), while 24-hour averaged data showed excellent agreement (R2 = 0.95, slope = 1.05). We also found a strong correlation between those instruments and the gravimetric method using 24-hour averaged data. A linear regression was used to calibrate the 24-hour averaged sensor data and, once calibrated, the bias dropped to zero. Conclusion: These results show that low-cost sensors can be used for daily measurements of PM2.5 concentrations in Ho Chi Minh City. Further studies on the effects of ambient conditions, such as temperature and humidity, should be conducted. Moreover, technical methods to improve the time resolution of low-cost sensors need to be developed and applied in order to provide real-time measurements at low cost.
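The calibration step the abstract describes (a linear regression of 24-hour averaged sensor readings against the reference, after which the mean bias is zero) can be sketched as follows. The data below are illustrative, not the study's measurements.

```python
# Sketch of ordinary-least-squares calibration of low-cost PM2.5 readings
# against a reference monitor (e.g., a BAM), as described in the abstract.

def fit_linear(x, y):
    """Ordinary least squares fit y = a*x + b; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    return a, my - a * mx

def calibrate(raw, a, b):
    return [a * r + b for r in raw]

# Illustrative 24-hour averages (ug/m3): sensor reads slightly high.
sensor = [21.0, 30.5, 41.0, 52.5, 63.0]
bam    = [20.0, 29.0, 39.0, 50.0, 60.0]

a, b = fit_linear(sensor, bam)
corrected = calibrate(sensor, a, b)
bias = sum(c - r for c, r in zip(corrected, bam)) / len(bam)
# With an intercept in the fit, OLS drives the mean bias to zero.
```

Because the fit includes an intercept, the residuals sum to zero by construction, which is why the calibrated bias vanishes.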

2020 ◽  
Vol 10 (17) ◽  
pp. 5882
Author(s):  
Federico Desimoni ◽  
Sergio Ilarri ◽  
Laura Po ◽  
Federica Rollo ◽  
Raquel Trillo-Lado

Modern cities face pressing problems with transportation systems including, but not limited to, traffic congestion, safety, health, and pollution. To tackle them, public administrations have implemented roadside infrastructures such as cameras and sensors to collect data about environmental and traffic conditions. In the case of traffic sensor data, not only are the real-time data essential, but historical values also need to be preserved and published. When real-time and historical data of smart cities become available, everyone can join an evidence-based debate on the city's future evolution. The TRAFAIR (Understanding Traffic Flows to Improve Air Quality) project seeks to understand how traffic affects urban air quality. The project develops a platform to provide real-time and predicted values on air quality in several cities in Europe, encompassing tasks such as the deployment of low-cost air quality sensors, data collection and integration, modeling and prediction, the publication of open data, and the development of applications for end-users and public administrations. This paper explicitly focuses on the modeling and semantic annotation of traffic data. We present the tools and techniques used in the project and validate our strategies for data modeling and its semantic enrichment over two cities: Modena (Italy) and Zaragoza (Spain). An experimental evaluation shows that our approach to publishing Linked Data is effective.
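To make the idea of semantic enrichment concrete, a traffic observation can be serialized as RDF triples. The W3C SOSA vocabulary used below is a common choice for sensor observations; the TRAFAIR project's actual data model and URIs may differ, and the namespace here is hypothetical.

```python
# Sketch: annotating one traffic-sensor reading as RDF N-Triples using the
# SOSA vocabulary. BASE is a made-up namespace for illustration only.

SOSA = "http://www.w3.org/ns/sosa/"
RDF_TYPE = "http://www.w3.org/1999/02/22-rdf-syntax-ns#type"
XSD = "http://www.w3.org/2001/XMLSchema#"
BASE = "http://example.org/trafair/"   # hypothetical namespace

def observation_triples(obs_id, sensor_id, vehicle_count, timestamp):
    s = f"<{BASE}obs/{obs_id}>"
    return [
        f"{s} <{RDF_TYPE}> <{SOSA}Observation> .",
        f"{s} <{SOSA}madeBySensor> <{BASE}sensor/{sensor_id}> .",
        f'{s} <{SOSA}hasSimpleResult> "{vehicle_count}"^^<{XSD}integer> .',
        f'{s} <{SOSA}resultTime> "{timestamp}"^^<{XSD}dateTime> .',
    ]

triples = observation_triples("42", "loop-7", 128, "2020-05-01T08:00:00Z")
print("\n".join(triples))
```

Publishing such triples is what makes historical traffic values queryable alongside other open city data.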


Sensors ◽  
2020 ◽  
Vol 21 (1) ◽  
pp. 50
Author(s):  
Steve H. L. Liang ◽  
Sara Saeedi ◽  
Soroush Ojagh ◽  
Sepehr Honarparvar ◽  
Sina Kiaei ◽  
...  

To safely protect workplaces and the workforce during and after the COVID-19 pandemic, a scalable integrated sensing solution is required in order to offer real-time situational awareness and early warnings for decision-makers. However, an information-based solution for industry reopening is ineffective when the necessary operational information is locked up in disparate real-time data silos. There is a lot of ongoing effort to combat the COVID-19 pandemic using different combinations of low-cost, location-based contact tracing, and sensing technologies. These ad hoc Internet of Things (IoT) solutions for COVID-19 were developed using different data models and protocols without an interoperable way to interconnect these heterogeneous systems and exchange data on people and place interactions. This research aims to design and develop an interoperable Internet of COVID-19 Things (IoCT) architecture that is able to exchange, aggregate, and reuse disparate IoT sensor data sources in order for informed decisions to be made after understanding the real-time risks in workplaces based on person-to-place interactions. The IoCT architecture is based on the Sensor Web paradigm that connects various Things, Sensors, and Datastreams with an indoor geospatial data model. This paper presents a study of what, to the best of our knowledge, is the first real-world integrated implementation of the Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) and IndoorGML standards to calculate the risk of COVID-19 online using a workplace reopening case study. The proposed IoCT offers a new open standard-based information model, architecture, methodologies, and software tools that enable the interoperability of disparate COVID-19 monitoring systems with finer spatial-temporal granularity. A workplace cleaning use case was developed in order to demonstrate the capabilities of this proposed IoCT architecture. 
The implemented IoCT architecture included proximity-based contact tracing, people density sensors, a COVID-19 risky behavior monitoring system, and the contextual building geospatial data.
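The person-to-place risk idea can be illustrated by combining two datastreams (room occupancy and time since last cleaning) into a per-room score that drives the cleaning use case. The weighting below is purely illustrative and is not the paper's actual risk model.

```python
# Sketch: a toy per-room COVID-19 risk score from two IoCT-style datastreams.
# Weights and thresholds are assumptions for illustration, not the paper's model.

def room_risk(occupancy, capacity, hours_since_cleaning, max_interval=24.0):
    """Return a 0..1 risk score: crowding and staleness each contribute half."""
    crowding = min(occupancy / capacity, 1.0)
    staleness = min(hours_since_cleaning / max_interval, 1.0)
    return 0.5 * crowding + 0.5 * staleness

def rooms_to_clean(rooms, threshold=0.6):
    """rooms: dict room_id -> (occupancy, capacity, hours_since_cleaning)."""
    return sorted(r for r, (o, c, h) in rooms.items()
                  if room_risk(o, c, h) >= threshold)

rooms = {
    "meeting-1": (8, 10, 20),   # crowded and stale -> high risk
    "office-2":  (1, 4, 2),     # nearly empty, recently cleaned
}
print(rooms_to_clean(rooms))    # -> ['meeting-1']
```

In the actual architecture, the inputs to such a score would arrive as interoperable OGC SWE Datastreams anchored to IndoorGML cells rather than a hard-coded dictionary.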


2021 ◽  
Vol 13 (8) ◽  
pp. 4496
Author(s):  
Giuseppe Desogus ◽  
Emanuela Quaquero ◽  
Giulia Rubiu ◽  
Gianluca Gatto ◽  
Cristian Perra

Low accessibility to information about a building's current performance makes it very difficult to plan appropriate interventions. Internet of Things (IoT) sensors make available a large quantity of data on the energy consumption and indoor conditions of an existing building that can drive the choice of energy retrofit interventions. Moreover, current developments in the topic of the digital twin are driving the diffusion of Building Information Modeling (BIM) methods and tools that can provide valid support to manage all data and information for the retrofit process. This paper presents the aim and the findings of research focused on testing the integrated use of BIM methodology and IoT systems. A common data platform for the visualization of building indoor conditions (e.g., temperature, luminance) and of energy consumption parameters was developed. This platform, tested on a case study located in Italy, integrates low-cost IoT sensors with the Revit model. To obtain a dynamic and automated exchange of data between the sensors and the BIM model, the Revit software was integrated with the Dynamo visual programming platform and with a specific Application Programming Interface (API). It is an easy and straightforward tool that provides building managers with real-time data and information about the energy consumption and the indoor conditions of buildings, and it also allows viewing the historical sensor data table and plotting historical sensor data. Furthermore, the BIM model allows the management of other useful information about the building, such as dimensional data, functions, characteristics of the building components, maintenance status, etc., which are essential for a much more conscious, effective, and accurate management of the building and for defining the most suitable retrofit scenarios.
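The sensor-to-BIM data flow described above can be sketched as follows: readings keyed by sensor location are written onto matching model elements as current parameter values and appended to a history table. The class and parameter names are hypothetical stand-ins, not the Revit or Dynamo API.

```python
# Sketch of the sensor -> BIM exchange: each reading updates an element's
# current parameter value and is kept in a per-element history for plotting.

from collections import defaultdict

class BimModelMirror:
    """Toy stand-in for a BIM model updated from IoT readings."""

    def __init__(self, element_ids):
        self.params = {e: {} for e in element_ids}      # element -> parameters
        self.history = defaultdict(list)                # element -> time series

    def push_reading(self, element_id, parameter, timestamp, value):
        self.params[element_id][parameter] = value       # current value
        self.history[element_id].append((timestamp, parameter, value))

model = BimModelMirror(["Room_101"])
model.push_reading("Room_101", "AirTemperature_C", "2021-03-01T10:00", 21.4)
model.push_reading("Room_101", "AirTemperature_C", "2021-03-01T11:00", 21.9)
print(model.params["Room_101"]["AirTemperature_C"])   # -> 21.9
print(len(model.history["Room_101"]))                 # -> 2
```

In the paper's setup this role is played by Dynamo scripts and the API bridging the IoT platform and Revit, but the update-plus-history pattern is the same.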


2021 ◽  
Author(s):  
Goedele Verreydt ◽  
Niels Van Putte ◽  
Timothy De Kleyn ◽  
Joris Cool ◽  
Bino Maiheu

Groundwater dynamics play a crucial role in the spreading of soil and groundwater contamination. However, there is still a big gap in the understanding of groundwater flow dynamics. Heterogeneities and dynamics are often underestimated and therefore not taken into account, yet they are crucial inputs for successful management and remediation measures. The bulk of the contaminant mass is often transported through only a small layer or section within the aquifer and, in cases of seepage into surface water, is strongly dependent on rainfall and tidal effects.

This study presents the use of novel real-time iFLUX sensors to map groundwater flow dynamics over time. The sensors provide real-time data on groundwater flow rate and flow direction. The sensor probes consist of multiple superimposed bidirectional flow sensors. The probes can be installed directly in the subsoil, riverbed, or monitoring well. The measurement setup is unique in that it can perform measurements every second, which is ideal for mapping rapidly changing flow conditions. The measurement range is between 0.5 and 500 cm per day.

We will present the measurement principles and technical aspects of the sensor, together with two case studies.

The first case study comprises the installation of iFLUX sensors in four different monitoring wells in a chlorinated solvent plume to map, on the one hand, the flow patterns in the plume and, on the other hand, the flow dynamics influenced by the nearby poplar trees. The foreseen remediation concept here is phytoremediation. The sensors were installed for a period of four weeks in total, with a measurement frequency of 5 minutes. The flow profiles and time series will be presented together with the determined mass fluxes.

A second case study was performed as part of the remediation of a canal riverbed. Due to the industrial production of tar and carbon black in the past, the soil and groundwater next to the small canal 'De Lieve' in Ghent, Belgium, became contaminated with aliphatic and (poly)aromatic hydrocarbons. The groundwater contaminants migrate to the canal, impact the surface water quality, and pose an ecological risk. The seepage flow and mass fluxes of contaminants into the surface water were measured with the novel iFLUX streambed sensors, installed directly in the river sediment. A site conceptual model was drawn up and dimensioned based on the sensor data. The remediation concept to tackle the inflowing pollution is a hydraulically conductive reactive mat on the riverbed that makes use of the natural draining function of the waterbody, the adsorption capacity of a natural or secondary adsorbent, and a future habitat for micro-organisms that biodegrade contaminants. The reactive mats were successfully installed, and based on the mass flux calculations a lifespan of at least 10 years is expected for the adsorption material.
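A mass-flux estimate of the kind used to dimension such a reactive mat multiplies the seepage flux by the contaminant concentration and integrates over the riverbed area. The conversion below is standard unit arithmetic; the input values are illustrative, not site data.

```python
# Sketch: contaminant mass flux through a riverbed section.
# mass flux (g/day) = seepage flux (cm/day) x concentration (ug/L) x area (m2),
# with unit conversions made explicit.

def mass_flux_g_per_day(seepage_cm_per_day, conc_ug_per_l, area_m2):
    """Darcy-type seepage flux times concentration, integrated over an area."""
    q_m_per_day = seepage_cm_per_day / 100.0          # cm/day -> m/day
    water_m3_per_day = q_m_per_day * area_m2          # discharge through the bed
    conc_g_per_m3 = conc_ug_per_l / 1000.0            # 1 ug/L = 1e-3 g/m3
    return water_m3_per_day * conc_g_per_m3

# e.g. 10 cm/day seepage, 500 ug/L contaminant, 200 m2 of riverbed:
flux = mass_flux_g_per_day(seepage_cm_per_day=10, conc_ug_per_l=500, area_m2=200)
print(flux)   # -> 10.0 g/day
```

Multiplying such a daily flux by the design life gives the total adsorbed mass the mat's sorbent must accommodate, which is how a lifespan estimate follows from the flux measurements.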


2018 ◽  
Vol 210 ◽  
pp. 03008
Author(s):  
Aparajita Das ◽  
Manash Pratim Sarma ◽  
Kandarpa Kumar Sarma ◽  
Nikos Mastorakis

This paper describes the design of an operative prototype based on Internet of Things (IoT) concepts for real-time monitoring of various environmental conditions using commonly available, low-cost sensors. Environmental conditions such as temperature, humidity, air pollution, sunlight intensity, and rain are continuously monitored, processed, and controlled by an Arduino Uno microcontroller board with the help of several sensors. Captured data are broadcast over the internet with an ESP8266 Wi-Fi module. The proposed system delivers sensor data to an API called ThingSpeak over the HTTP protocol and allows the data to be stored. The proposed system works well and proves reliable. The prototype has been used to monitor and analyse real-time data using graphical information about the environment.
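The upload step the prototype performs is a plain HTTP GET against ThingSpeak's public "update" endpoint. The prototype itself runs this from Arduino/ESP8266 firmware; the Python sketch below only illustrates the same request, with a placeholder API key.

```python
# Sketch of a ThingSpeak channel update: one GET request with an API key and
# numbered fields (field1, field2, ...), per ThingSpeak's REST API. The key
# below is a placeholder; urllib.request.urlopen(url) would send the request.

from urllib.parse import urlencode

THINGSPEAK_UPDATE = "https://api.thingspeak.com/update"

def build_update_url(api_key, **fields):
    """fields like field1=temperature, field2=humidity map to channel fields."""
    params = {"api_key": api_key, **fields}
    return f"{THINGSPEAK_UPDATE}?{urlencode(params)}"

url = build_update_url("XXXXXXXX", field1=27.5, field2=81)
print(url)
# from urllib.request import urlopen; urlopen(url)  # on a connected device
```

ThingSpeak then stores the posted values, which is what lets the prototype plot the captured environment data over time.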


2020 ◽  
Vol 17 (3) ◽  
pp. 867-890
Author(s):  
Jun-Hee Choi ◽  
Hyun-Sug Cho

The gravimetric method, which is mainly used among particulate matter (PM) measurement methods, has the disadvantages that it cannot measure PM in real time and that it requires expensive equipment. To overcome these disadvantages, we have developed a light-scattering PM sensor that can be manufactured at low cost and can measure PM in real time. We have built a big data system that can systematically store and analyze the data collected through the developed sensor, as well as an environment where PM levels can be monitored on mobile devices in real time using such data. Additional studies were also conducted to analyze and correct the collected big data in order to overcome the low accuracy that is a disadvantage of light-scattering PM sensors. We used a linear correction method and proceeded to adopt the most suitable value based on error and accuracy.
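The "adopt the most suitable value based on error" step can be sketched as scoring several candidate linear corrections against reference values and keeping the one with the lowest error. The coefficients and data below are illustrative, not the study's.

```python
# Sketch: choosing among candidate linear corrections (slope, intercept) for
# light-scattering PM readings by mean absolute error against a reference.

def mae(pred, ref):
    """Mean absolute error between predictions and reference values."""
    return sum(abs(p - r) for p, r in zip(pred, ref)) / len(ref)

def pick_correction(raw, ref, candidates):
    """candidates: list of (slope, intercept); returns the lowest-MAE pair."""
    scored = [(mae([a * x + b for x in raw], ref), (a, b))
              for a, b in candidates]
    return min(scored)[1]

raw = [12.0, 25.0, 40.0, 55.0]     # illustrative sensor readings (ug/m3)
ref = [10.0, 21.0, 33.5, 46.0]     # illustrative gravimetric references
best = pick_correction(raw, ref, [(1.0, 0.0), (0.84, 0.0), (0.84, 0.5)])
print(best)   # -> (0.84, 0.0)
```

Any error metric (RMSE, bias) could replace MAE here; the selection pattern is the same.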


2020 ◽  
Vol 12 (23) ◽  
pp. 10175
Author(s):  
Fatima Abdullah ◽  
Limei Peng ◽  
Byungchul Tak

The volume of streaming sensor data from various environmental sensors continues to increase rapidly due to wider deployments of IoT devices at much greater scales than ever before. This, in turn, causes a massive increase in fog and cloud network traffic, which leads to heavily delayed network operations. In streaming data analytics, the ability to obtain real-time data insight is crucial for computational sustainability for many IoT-enabled applications such as environmental monitors, pollution and climate surveillance, traffic control, or even e-commerce applications. However, such network delays prevent us from achieving high-quality real-time data analytics of environmental information. In order to address this challenge, we propose the Fog Sampling Node Selector (Fossel) technique, which can significantly reduce IoT network and processing delays by algorithmically selecting an optimal subset of fog nodes to perform the sensor data sampling. In addition, our technique performs simple query executions within the fog nodes in order to further reduce network delays by processing the data near the data-producing devices. Our extensive evaluations show that the Fossel technique outperforms the state of the art in latency reduction as well as in bandwidth consumption, network usage, and energy consumption.
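The flavor of fog-node subset selection can be illustrated with a simple greedy heuristic: repeatedly pick the node that covers the most still-unsampled sensors per unit of latency. This is a generic set-cover-style sketch, not the paper's actual Fossel algorithm.

```python
# Sketch: greedy selection of fog nodes for sensor sampling, scoring each
# node by newly covered sensors per millisecond of latency. Illustrative
# stand-in for the idea of Fossel, not the published algorithm.

def select_fog_nodes(nodes, sensors):
    """nodes: node_id -> (latency_ms, reachable_sensor_set)."""
    uncovered, chosen = set(sensors), []
    while uncovered:
        node, gain = max(((n, len(reach & uncovered) / lat)
                          for n, (lat, reach) in nodes.items()),
                         key=lambda t: t[1])
        if gain == 0:
            break            # remaining sensors unreachable from any fog node
        chosen.append(node)
        uncovered -= nodes[node][1]
    return chosen

nodes = {"fog-1": (5.0, {"s1", "s2"}),
         "fog-2": (2.0, {"s2"}),
         "fog-3": (4.0, {"s3"})}
print(select_fog_nodes(nodes, {"s1", "s2", "s3"}))   # -> ['fog-2', 'fog-3', 'fog-1']
```

Executing simple queries (filters, aggregates) on the chosen nodes then keeps most raw readings out of the fog-to-cloud path, which is where the bandwidth and latency savings come from.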


2020 ◽  
Vol 12 (13) ◽  
pp. 5368
Author(s):  
Tomasz Owczarek ◽  
Mariusz Rogulski ◽  
Piotr O. Czechowski

The aim of the work is to demonstrate the possibility of building models that correct the results of measurements of particulate matter (PM10) concentrations obtained using low-cost devices, which apply the optical method, to values comparable with those obtained using the reference gravimetric method. An additional goal is to show that the results corrected in this way can be used to carry out the procedure for testing the equivalence of these methods. The study used generalized regression models (GRMs) to construct the corrective functions. The constructed models were assessed using coefficients of determination and the methodology of calculating the measurement uncertainty of the device. Measurement data from the two tested devices and the reference method were used to estimate the model parameters. The measurement data were collected on a daily basis from 1 February to 30 June 2018 in Nowy Sącz. Regression allowed building multiple models with various functional forms, very promising statistical properties, and a good ability to describe the variability of the reference measurements. These models also had very low values of measurement uncertainty. Of all the models constructed, a linear model using the original PM10 concentrations from the tested devices, air humidity, and wind speed was chosen as the most accurate and simplest model. Apart from the coefficient of determination, the expanded relative uncertainty served as the measure of quality of the obtained model. Its small value, much lower than 25%, indicates that after correcting the results it is possible to carry out the equivalence testing procedure for the low-cost devices and confirm the equivalence of the tested method with the reference method.
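The chosen model form (reference PM10 predicted from raw PM10, air humidity, and wind speed via a linear fit) can be sketched with ordinary least squares. The synthetic data and resulting coefficients below are illustrative only and do not reproduce the study's model.

```python
# Sketch: a linear correction model ref_PM10 ~ raw_PM10 + humidity + wind,
# fitted by ordinary least squares on synthetic daily records.

import numpy as np

# columns: raw PM10 (ug/m3), relative humidity (%), wind speed (m/s)
X_raw = np.array([[30, 70, 1.0], [45, 80, 0.5], [25, 60, 2.0],
                  [50, 85, 0.8], [35, 75, 1.5], [40, 65, 1.2]])
y = np.array([24.0, 33.5, 22.0, 36.5, 27.5, 33.0])   # reference PM10 (ug/m3)

X = np.column_stack([np.ones(len(X_raw)), X_raw])     # add intercept column
beta, *_ = np.linalg.lstsq(X, y, rcond=None)          # [b0, b_pm, b_rh, b_wind]
corrected = X @ beta
rmse = float(np.sqrt(np.mean((corrected - y) ** 2)))
```

In the study, the fitted model's quality was judged not only by fit statistics like these but by the expanded relative uncertainty required for the formal equivalence-testing procedure.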


2020 ◽  
Vol 10 (20) ◽  
pp. 7054
Author(s):  
Muzaffar Rao ◽  
Liam Lynch ◽  
James Coady ◽  
Daniel Toal ◽  
Thomas Newe

Industry 4.0 uses the analysis of real-time data, artificial intelligence, automation, and the interconnection of production line components to improve manufacturing efficiency and quality. Manufacturing Execution Systems (MESs) and Autonomous Intelligent Vehicles (AIVs) are key elements of Industry 4.0 implementations. An MES connects, monitors, and controls data flows on the factory floor, while automation is achieved by using AIVs. AIVs built on the Robot Operating System (ROS) are targeted here. To facilitate MES and AIV interactions, the MES and the AIVs need to be integrated to help build an automated and interconnected manufacturing environment. This integration needs middleware that understands both the MES and the AIVs. To address this issue, a LabVIEW-based scheduler is proposed here as the middleware. LabVIEW communicates with the MES through web services and has support for ROS. The main task of the scheduler is to control the AIV based on MES requests. The scheduler developed was tested in a real factory environment using the SAP MES and a Robotnik 'RB-1' robot. The scheduler interface provides real-time information about the current status of the MES, the AIV, and the current stage of scheduler processing. The proposed scheduler provides an efficient automated product delivery system that transports the product from process cell to process cell using the AIV, based on the production sequences defined by the MES. In addition, using the proposed scheduler, an MES can be integrated with any low-cost ROS-built AIV.
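The scheduler's control loop (poll the MES for a transport request, dispatch the AIV, report completion) can be sketched with stub interfaces. The real middleware is built in LabVIEW against SAP MES web services and ROS; the Python below and its method names are illustrative stand-ins only.

```python
# Sketch of the middleware cycle: MES request -> AIV delivery -> MES report.
# StubMes/StubAiv are hypothetical stand-ins for the web-service and ROS sides.

def run_scheduler(mes, aiv, max_cycles=100):
    """mes.next_request() -> (product_id, from_cell, to_cell) or None;
    aiv.deliver(src, dst) -> True on success."""
    completed = []
    for _ in range(max_cycles):
        request = mes.next_request()
        if request is None:
            break                                # no pending transport jobs
        product, src, dst = request
        if aiv.deliver(src, dst):                # blocking navigation goal
            mes.report_done(product)             # close the job in the MES
            completed.append(product)
    return completed

class StubMes:
    def __init__(self, jobs): self.jobs, self.done = list(jobs), []
    def next_request(self): return self.jobs.pop(0) if self.jobs else None
    def report_done(self, product): self.done.append(product)

class StubAiv:
    def deliver(self, src, dst): return True

mes = StubMes([("P1", "cell-A", "cell-B"), ("P2", "cell-B", "cell-C")])
print(run_scheduler(mes, StubAiv()))   # -> ['P1', 'P2']
```

Because the loop only depends on these two narrow interfaces, any ROS-built AIV that can accept a delivery goal can be slotted in, which is the portability claim the abstract makes.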


2017 ◽  
Vol 237 (4) ◽  
pp. 329-341
Author(s):  
Kevin Kovacs ◽  
Bryan Boulier ◽  
Herman Stekler

Abstract Historically, forecasters have failed to predict cyclical turning points, and the forecasting record in this regard has not improved. This suggests that we should focus on what should be an easier task: recognizing recessions as they occur. We present a new approach that enables us to determine in real time when there is a significant deviation from an economy's dynamic growth path. This approach uses a CUSUM-like methodology and requires us to construct an index, which we call the Economic News Index, from real-time data that show how the economy is functioning. We apply this approach to German data to nowcast the recessions that began in 2008 and 2012.
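The flavor of a CUSUM-like detector can be sketched as follows: a one-sided statistic accumulates shortfalls of the index below its normal level and signals when it crosses a threshold. The index values, slack, and threshold below are synthetic illustrations, not the authors' specification.

```python
# Sketch: one-sided (downward) CUSUM on a synthetic news index. The statistic
# grows only on sustained negative surprises and signals past a threshold.

def cusum_signal(series, mean, slack=0.5, threshold=3.0):
    """Return the first index where the downward CUSUM exceeds threshold."""
    s = 0.0
    for i, x in enumerate(series):
        s = max(0.0, s + (mean - x) - slack)   # accumulate negative surprises
        if s > threshold:
            return i
    return None

# index hovers near its mean of 0, then drops persistently (recession-like)
index = [0.2, -0.1, 0.3, -0.2, -2.0, -2.5, -1.8, -2.2]
print(cusum_signal(index, mean=0.0))   # -> 5
```

The slack term makes the detector ignore ordinary noise, so only a sustained deviation from the growth path, not a single bad reading, triggers the signal.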

