Energy-Aware Sensing on Battery-Less LoRaWAN Devices with Energy Harvesting

Electronics ◽  
2020 ◽  
Vol 9 (6) ◽  
pp. 904 ◽  
Author(s):  
Adnan Sabovic ◽  
Carmen Delgado ◽  
Dragan Subotic ◽  
Bart Jooris ◽  
Eli De Poorter ◽  
...  

Billions of Internet of Things (IoT) devices rely on batteries as the main power source. These batteries are short-lived, bulky and harmful to the environment. Battery-less devices provide a promising alternative for a sustainable IoT, where energy harvested from the environment is stored in small capacitors. This constrained energy storage and the unpredictable energy harvested result in intermittent on–off behavior of the device. Measuring and understanding the current consumption and execution time of different tasks of IoT applications is crucial to properly operate these battery-less devices. In this paper, we study how to properly schedule sensing and transmission tasks on a battery-less LoRaWAN device. We analyze the trade-off between sleeping and allowing the device to turn off between the execution of application tasks. This study allows us to properly define the device configuration (i.e., capacitor size) based on the application tasks (i.e., sensing and sending) and environmental conditions (i.e., harvesting rate). We define an optimization problem that determines the optimal capacitor voltage at which the device should start performing its tasks. Our results show that a device using LoRaWAN Class A can measure the temperature and transmit its data at least once every 5 s if it can harvest at least 10 mA of current and uses a relatively small capacitor of 10 mF or less. At harvesting rates below 3 mA, it is necessary to turn off the device between application cycles and use a larger supercapacitor of at least 140 mF. In this case, the device can transmit a temperature measurement once every 60–100 s.
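The feasibility check behind this trade-off can be sketched numerically: the energy usable from the capacitor between its start and cutoff voltages must cover the energy of one sensing-plus-transmission cycle. A minimal Python sketch, with all device parameters (voltages, currents, durations) invented for illustration rather than taken from the paper's measurements:

```python
# Sketch: feasibility of one application cycle on a battery-less device.
# All numeric parameters below are illustrative, not the paper's data.

def usable_energy(c_farads, v_start, v_off):
    """Energy (J) available between the start voltage and the cutoff voltage."""
    return 0.5 * c_farads * (v_start**2 - v_off**2)

def task_energy(current_a, voltage_v, duration_s):
    """Energy (J) a task draws at a given supply current and voltage."""
    return current_a * voltage_v * duration_s

# Hypothetical capacitor: 10 mF, charged to 3.3 V; device browns out at 1.8 V.
budget = usable_energy(10e-3, 3.3, 1.8)

# Hypothetical tasks: a temperature measurement plus one LoRaWAN Class A uplink.
cycle = task_energy(5e-3, 3.3, 0.1) + task_energy(40e-3, 3.3, 0.06)

feasible = cycle <= budget  # can the device complete the cycle before cutoff?
```

The same arithmetic, run in reverse, yields the minimum capacitor size or start voltage for a given cycle, which is the shape of the optimization problem the paper solves.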

2021 ◽  
Vol 17 (3) ◽  
pp. 1-23
Author(s):  
Borui Li ◽  
Wei Dong ◽  
Gaoyang Guan ◽  
Jiadong Zhang ◽  
Tao Gu ◽  
...  

Many IoT applications require complex IoT event processing (e.g., speech recognition) that is hardly supported by low-end IoT devices due to limited resources. Most existing approaches enable complex IoT event processing on low-end IoT devices by statically allocating tasks to the edge or the cloud. In this article, we present Queec, a QoE-aware edge computing system for complex IoT event processing under dynamic workloads. With Queec, complex IoT event processing tasks that are relatively computation-intensive for low-end IoT devices can be transparently offloaded to nearby edge nodes at runtime. We formulate the scheduling of multi-user tasks to multiple edge nodes as an optimization problem that minimizes the overall offloading latency of all tasks while avoiding node overloading. We implement Queec on low-end IoT devices, edge nodes, and the cloud. We conduct extensive evaluations, and the results show that Queec reduces offloading latency by 56.98% on average compared with the state of the art under dynamic workloads, while incurring acceptable overhead.
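The flavor of the scheduling problem can be approximated with a simple greedy heuristic: place each task on the lowest-latency edge node that is not yet overloaded. This is only a sketch in the spirit of Queec's formulation (the paper solves an optimization problem; the node names, latency estimates, and capacity threshold here are invented):

```python
# Greedy latency-aware placement sketch. Assumes total capacity across
# nodes is at least the number of tasks; all figures are illustrative.

def assign_tasks(tasks, nodes, capacity):
    """Map each task to the lowest-latency node that still has capacity.

    tasks    -- list of (task_id, {node: estimated_latency_s})
    nodes    -- list of node names
    capacity -- max concurrent tasks per node (overload threshold)
    """
    load = {n: 0 for n in nodes}
    placement = {}
    for task_id, latency in tasks:
        candidates = [n for n in nodes if load[n] < capacity]
        best = min(candidates, key=lambda n: latency[n])
        placement[task_id] = best
        load[best] += 1
    return placement

tasks = [("speech-1", {"edge-a": 0.08, "edge-b": 0.12}),
         ("speech-2", {"edge-a": 0.09, "edge-b": 0.10}),
         ("speech-3", {"edge-a": 0.07, "edge-b": 0.30})]
placement = assign_tasks(tasks, ["edge-a", "edge-b"], capacity=2)
```

Here the capacity cap is what prevents the overloading problem: the third task is forced onto the slower node once the faster one is full.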


2021 ◽  
Vol 1 (1) ◽  
pp. 1-14
Author(s):  
Alem Čolaković ◽  
Adisa Hasković Džubur ◽  
Bakir Karahodža

The Internet of Things (IoT) is an inter-networking paradigm based on processes such as identification, sensing, networking, and computation. An IoT technology stack provides seamless connectivity between various physical and virtual objects. The increasing number of IoT applications raises the issue of transmitting, storing, and processing large amounts of data. It is therefore necessary to build systems capable of handling the growing traffic with the required level of QoS (Quality of Service). IoT devices are becoming more complex due to components such as sensors and network interfaces. The IoT environment often imposes demanding requirements on power sources, QoS, mobility, reliability, and security, so new IoT technologies are required to overcome some of these issues. In recent years, new wireless communication technologies have been developed to support the development of new IoT applications. This paper provides an overview of some of the most widely used wireless communication technologies for IoT applications.


Network ◽  
2022 ◽  
Vol 2 (1) ◽  
pp. 36-52
Author(s):  
Miguel Rosendo ◽  
Jorge Granjal

The constant evolution in communication infrastructures will enable new Internet of Things (IoT) applications, particularly in areas that, up to today, have been mostly enabled by closed or proprietary technologies. Such applications will be enabled by a myriad of wireless communication technologies designed for all types of IoT devices, among which are the Long-Range Wide-Area Network (LoRaWAN) and other Low-Power Wide-Area Network (LPWAN) communication technologies. This applies to many critical environments, such as industrial control and healthcare, where wireless communications are yet to be broadly adopted. Two fundamental requirements to effectively support upcoming critical IoT applications are energy management and security. We may note that those are, in fact, contradictory goals. On the one hand, many IoT devices depend on the usage of batteries, while, on the other hand, adequate security mechanisms need to be in place to protect devices and communications from threats against their stability and security. With this motivation in mind, we propose a solution to manage security and energy in tandem in LoRaWAN IoT communication environments. We propose and evaluate an architecture in which adaptation logic is used to manage security and energy dynamically, with the goal of guaranteeing appropriate security while extending the lifetime of constrained sensing devices. The proposed solution was implemented and experimentally evaluated and was observed to successfully manage security and energy in line with the requirements of the application at hand, the characteristics of the constrained sensing devices employed, and the detection or threat of particular types of attacks.
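The kind of adaptation logic such an architecture describes might, in a much-simplified form, look like the following sketch, where a security profile is chosen from the device's battery level and the current threat status. The profile names, thresholds, and rekeying intervals are invented for illustration; the paper's logic also weighs application requirements and device characteristics:

```python
# Sketch of dynamic security/energy adaptation. All values are invented.

def select_profile(battery_pct, attack_detected):
    """Return (profile, rekeying_interval_s), trading security for energy."""
    if attack_detected:
        return ("high", 3600)    # strongest protection, frequent rekeying
    if battery_pct > 50:
        return ("high", 21600)   # plenty of energy: keep security high
    if battery_pct > 20:
        return ("medium", 43200)
    return ("low", 86400)        # near-depleted: minimize crypto/radio cost

profile = select_profile(battery_pct=35, attack_detected=False)
```

The essential point is the feedback loop: detected attacks override the energy-saving policy, while a shrinking energy budget relaxes security only down to the application's floor.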


2021 ◽  
Vol 2021 ◽  
pp. 1-12
Author(s):  
Pablo Sanabria ◽  
Tomás Felipe Tapia ◽  
Andres Neyem ◽  
Jose Ignacio Benedetto ◽  
Matías Hirsch ◽  
...  

Mobile grid computing has been a popular research topic due to the ubiquity of mobile and IoT devices and their ever-growing processing potential. While many scheduling algorithms for harnessing these resources exist in the literature for standard grid computing scenarios, there is surprisingly little insight into this matter in the context of hybrid-powered computing resources, typically found in Dew and Edge computing environments. This paper proposes new algorithms, aware of each device's power source, for scheduling tasks in hybrid environments, i.e., where battery- and non-battery-powered devices cooperate. We simulated hybrid Dew/Edge environments by extending DewSim, a simulator that models the battery behavior of battery-driven devices using traces profiled from real mobile devices. We compared the throughput and job completion achieved by the proposed algorithms against a previously developed baseline, Enhanced Simple Energy-Aware Schedule (E-SEAS), which considers computing resources only from battery-dependent devices. The simulation results reveal that our algorithms can achieve up to a 90% increase in overall throughput and around 95% of completed jobs in hybrid environments compared to E-SEAS. Finally, we show that incorporating these characteristics gives the algorithms more awareness of the types of resources present and enables them to manage resources in hybrid environments more efficiently than other algorithms found in the literature.
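The core idea of power-source-aware scheduling can be illustrated with a small ranking function: prefer mains-powered devices, and among battery-powered ones prefer those with more remaining charge. The device records are invented, and the paper's algorithms also weigh computing capability, which this sketch ignores:

```python
# Power-source-aware device ranking sketch for a hybrid Dew/Edge pool.
# Device records are illustrative; real schedulers also consider CPU/RAM.

def rank_devices(devices):
    """Order devices for task dispatch: mains-powered first,
    then battery-powered by descending remaining charge."""
    return sorted(devices,
                  key=lambda d: (0 if d["mains"] else 1,
                                 -d.get("battery_pct", 100)))

devices = [{"name": "phone-1", "mains": False, "battery_pct": 35},
           {"name": "sbc-1",   "mains": True},
           {"name": "phone-2", "mains": False, "battery_pct": 80}]
order = [d["name"] for d in rank_devices(devices)]
```

Dispatching work in this order drains no battery while a mains-powered device is idle, which is exactly the asymmetry a battery-only scheduler such as E-SEAS cannot exploit.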


2021 ◽  
Vol 10 (1) ◽  
pp. 13
Author(s):  
Claudia Campolo ◽  
Giacomo Genovese ◽  
Antonio Iera ◽  
Antonella Molinaro

Several Internet of Things (IoT) applications are booming that rely on advanced artificial intelligence (AI) and, in particular, machine learning (ML) algorithms to assist users and make decisions on their behalf in a large variety of contexts, such as smart homes, smart cities, and smart factories. Although the traditional approach is to deploy such compute-intensive algorithms in the centralized cloud, the recent proliferation of low-cost, AI-powered microcontrollers and consumer devices paves the way for spreading intelligence pervasively along the cloud-to-things continuum. The takeoff of such a promising vision may be hindered by the resource constraints of IoT devices and by the heterogeneity of (mostly proprietary) AI-embedded software and hardware platforms. In this paper, we propose a solution for distributed AI deployment at the deep edge, which lays its foundation in the IoT virtualization concept. We design a virtualization layer, hosted at the network edge, that is in charge of the semantic description of AI-embedded IoT devices and can hence expose as well as augment their cognitive capabilities in order to feed intelligent IoT applications. The proposal has been devised with the twofold aim of (i) relieving the pressure on constrained devices solicited by multiple parties interested in accessing their generated data and inference, and (ii) targeting interoperability among AI-powered platforms. A Proof-of-Concept (PoC) is provided to showcase the viability and advantages of the proposed solution.


Electronics ◽  
2021 ◽  
Vol 10 (16) ◽  
pp. 1876
Author(s):  
Ioana Apostol ◽  
Marius Preda ◽  
Constantin Nila ◽  
Ion Bica

The Internet of Things has become a cutting-edge technology that is continuously evolving in size, connectivity, and applicability. This ecosystem makes its presence felt in every aspect of our lives, along with all other emerging technologies. Unfortunately, despite the significant benefits brought by the IoT, the increased attack surface built upon it has become more critical than ever. Devices have limited resources and are not typically created with security features. Lately, a trend of botnet threats transitioning to the IoT environment has been observed, and an army of infected IoT devices can expand quickly and be used for effective attacks. Therefore, identifying proper solutions for securing IoT systems is currently an important and challenging research topic. Machine learning-based approaches are a promising alternative, allowing the identification of abnormal behaviors and the detection of attacks. This paper proposes an anomaly-based detection solution that uses unsupervised deep learning techniques to identify IoT botnet activities. An empirical evaluation of the proposed method is conducted on both balanced and unbalanced datasets to assess its threat detection capability. False-positive rate reduction and its impact on the detection system are also analyzed. Furthermore, a comparison with other unsupervised learning approaches is included. The experimental results reveal the performance of the proposed detection method.
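Stripped of the deep-learning model, the anomaly-based decision rule amounts to scoring traffic by reconstruction error and flagging samples above a threshold calibrated on benign data. In the sketch below, a trivial mean-based reconstruction stands in for the trained autoencoder, and the synthetic feature vectors are invented; the thresholding and false-positive-rate calibration are the part being illustrated:

```python
import numpy as np

# Anomaly-detection decision rule sketch. A real system would score traffic
# with a trained autoencoder; a mean "reconstruction" stands in here so the
# threshold-calibration logic runs end to end on synthetic data.

rng = np.random.default_rng(0)
benign = rng.normal(0.0, 1.0, size=(500, 8))   # benign traffic features
center = benign.mean(axis=0)                   # "model" of normal behavior

def score(x):
    """Reconstruction error: distance from the learned normal profile."""
    return np.linalg.norm(x - center, axis=1)

# Calibrate the threshold to a target false-positive rate on benign data:
# the 99th percentile admits ~1% false alarms.
threshold = np.quantile(score(benign), 0.99)

botnet_like = rng.normal(4.0, 1.0, size=(50, 8))  # shifted, attack-like
alerts = score(botnet_like) > threshold
```

Raising the quantile lowers the false-positive rate at the cost of missing weaker anomalies, which is the trade-off the paper's false-positive-rate analysis examines.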


The Internet of Things (IoT) is playing a pivotal role in our daily life as well as in various fields such as health, agriculture, and industry. As data in the various IoT applications become easily accessible in the physical domain, ensuring the security of these data is a major concern. Data security is a critical component for the widespread deployment of the numerous applications of IoT. In our work, we have developed an encryption technique to secure IoT data. The data collected from the various IoT devices are first encrypted with Merkle-Hellman encryption, and the secret message is then generated with the help of Elliptic Curve Cryptography.
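The Merkle-Hellman knapsack step itself is simple to sketch: a private superincreasing sequence is disguised by modular multiplication, encryption sums the public weights selected by the plaintext bits, and decryption undoes the multiplication and greedily solves the superincreasing subset sum. The key values below are textbook toy parameters, far too small for real security, and the Elliptic Curve Cryptography step mentioned in the abstract is omitted:

```python
# Sketch of Merkle-Hellman knapsack encryption for one byte of sensor data.
# Toy key material for illustration only; the paper's ECC step is omitted.

w = [2, 7, 11, 21, 42, 89, 180, 354]   # private superincreasing sequence
q = 881                                # modulus, greater than sum(w) = 706
r = 588                                # multiplier, coprime with q
public_key = [(r * wi) % q for wi in w]

def encrypt(byte, pub):
    """Sum the public weights selected by the plaintext bits (MSB first)."""
    bits = [(byte >> (7 - i)) & 1 for i in range(8)]
    return sum(b * k for b, k in zip(bits, pub))

def decrypt(cipher, w, q, r):
    """Undo the modular disguise, then solve the superincreasing subset sum."""
    c = (cipher * pow(r, -1, q)) % q   # modular inverse of r (Python 3.8+)
    bits = []
    for wi in reversed(w):             # greedy: largest weight first
        if wi <= c:
            bits.append(1)
            c -= wi
        else:
            bits.append(0)
    bits.reverse()
    return sum(b << (7 - i) for i, b in enumerate(bits))

reading = 97                           # e.g. an encoded sensor value
recovered = decrypt(encrypt(reading, public_key), w, q, r)
```

Because the private sequence is superincreasing and q exceeds its sum, every subset sum decodes uniquely, so the round trip is exact for any byte. (The basic Merkle-Hellman scheme is known to be broken by Shamir's attack, which is presumably why the paper layers ECC on top.)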


2021 ◽  
Vol 2021 ◽  
pp. 1-13
Author(s):  
Zakaria Mahlaoui ◽  
Eva Antonino-Daviu ◽  
Miguel Ferrando-Bataller

Based on characteristic mode theory, a versatile radiation-pattern-reconfigurable antenna is proposed. The analysis starts from two parallel metallic plates, first with identical and then with different dimensions. By means of two PIN diodes, the size of one of the parallel metallic plates can be modified, and consequently the radiation pattern can be switched between bidirectional and unidirectional. Moreover, an SPDT switch is used to adjust the frequency and match the input impedance. The reconfigurable antenna prototype has been assembled and tested, and good agreement between simulated and measured results is obtained in the 2.5 GHz band, which fits IoT applications.

