Dynamic Transmission Rate Control for Multi-Interface IoT Devices: A Stochastic Optimization Framework

2021 ◽  
Vol 2021 ◽  
pp. 1-11
Author(s):  
Yuming Zhang ◽  
Bohao Feng ◽  
Aleteng Tian ◽  
Chengxiao Yu ◽  
Zhiruo Liu ◽  
...  

Recent advances in Internet of Things (IoT) technologies have enabled ubiquitous smart devices to sense and process various kinds of data. However, these innovations also raise concerns about efficient data transmission. Tackling this issue is nontrivial, since the resource constraints and environmental randomness in IoT call for a lightweight transmission scheme that still guarantees system stability. In this paper, we formulate the transmission scheduling problem of multi-interface IoT devices as a concave optimization problem, aimed at accommodating the randomness of the IoT environment within the network capacity. By applying the Lyapunov optimization technique, we divide the stochastic problem into a series of low-complexity subproblems that can be solved individually per time slot, and develop a dynamic control algorithm that requires no a priori knowledge such as link states. Theoretical analysis shows that our algorithm bounds the average queue length and is asymptotically optimal. Finally, extensive simulation results verify the theoretical conclusions and validate the effectiveness of the proposed algorithm.
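As a rough illustration of the drift-plus-penalty idea behind such Lyapunov-based schemes, the sketch below makes a per-slot rate decision from the current queue backlog alone, with no a priori link statistics. The two-interface setup, convex power cost, and all parameter values are illustrative assumptions, not the paper's model.

```python
# Minimal drift-plus-penalty sketch for per-slot transmission control.
# All names and parameters are illustrative assumptions.

import random

def drift_plus_penalty_rates(Q, capacities, power_cost, V):
    """Per interface, pick the rate r in [0, cap] minimizing
    V * power_cost(r) - Q * r  (one-slot drift-plus-penalty)."""
    rates = []
    for cap in capacities:
        # Evaluate a small grid of candidate rates for this interface.
        best = min((i * cap / 10 for i in range(11)),
                   key=lambda r: V * power_cost(r) - Q * r)
        rates.append(best)
    return rates

def simulate(slots=2000, arrival_mean=1.0, V=10.0, seed=0):
    rng = random.Random(seed)
    Q = 0.0            # queue backlog (bits, say)
    total_q = 0.0
    cost = lambda r: r ** 2          # convex power cost (assumption)
    for _ in range(slots):
        # Random per-slot link capacities, unknown in advance.
        caps = [rng.uniform(0.5, 1.5), rng.uniform(0.2, 1.0)]
        served = sum(drift_plus_penalty_rates(Q, caps, cost, V))
        Q = max(Q - served, 0.0) + rng.expovariate(1.0 / arrival_mean)
        total_q += Q
    return total_q / slots

avg_backlog = simulate()   # finite average backlog => queue stability
```

The decision rule uses only the current backlog Q and the observed capacities of the slot, mirroring the "no a priori knowledge" property; larger V trades longer queues for lower average power, the classic O(V) / O(1/V) tradeoff.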

Sensors ◽  
2021 ◽  
Vol 21 (24) ◽  
pp. 8320
Author(s):  
Abebe Diro ◽  
Naveen Chilamkurti ◽  
Van-Doan Nguyen ◽  
Will Heyne

The Internet of Things (IoT) consists of a massive number of smart devices capable of data collection, storage, processing, and communication. The adoption of the IoT has brought about tremendous innovation opportunities in industries, homes, the environment, and businesses. However, the inherent vulnerabilities of the IoT have sparked concerns about its wide adoption and applications. Unlike traditional information technology (IT) systems, the IoT environment is challenging to secure due to the resource constraints, heterogeneity, and distributed nature of smart devices. This makes it impossible to apply host-based prevention mechanisms such as anti-malware and anti-virus software. These challenges and the nature of IoT applications call for a monitoring system, such as anomaly detection, at both the device and network levels, beyond the organisational boundary. This suggests that an anomaly detection system is better positioned to secure IoT devices than any other security mechanism. In this paper, we aim to provide an in-depth review of existing work on developing anomaly detection solutions using machine learning to protect IoT systems. We also indicate that blockchain-based anomaly detection systems can collaboratively learn effective machine learning models to detect anomalies.


Author(s):  
Abidullha Adel ◽  
Md Sohel Rana ◽  
Nuruzzam Rana ◽  
Md Alamin Hosan ◽  
Mohammad Akbar Shapoor

The Internet of Things (IoT) offers interconnection among numerous wireless communication devices, providing device accessibility and built-in capacity. IoT enables device interaction and supports networking and socialization through intermediate devices. Through technological innovation, IoT devices transform cyber environments with hyper-connectivity. IoT communication involves many smart devices, such as body sensors, smartphones, tags, and electronic gadgets, and provides heterogeneous connectivity among devices to enhance service quality. However, data transmission among IoT devices is exposed to several threats that degrade network performance. To overcome these limitations, an appropriate technique for enhancing IoT network communication performance is required. In this research, a multi-channel routing approach for IoT communication is developed. The approach employs a meta-heuristic, the whale optimization algorithm (WOA), combined with probability-based characteristics to improve the communication performance of the IoT network. It first constructs the IoT communication path for information sharing and gathering; this path is identified through the objective function of the meta-heuristic. Based on the objective function, hopping between devices is minimized while data are transmitted through the network. Simulation is performed with a coverage area of 100 meters. To identify the optimal path in the network, WOA selects the communication path through a probability function.
Comparative analysis shows that WOA delivers significant performance, identifying an optimal value on the order of 1.0746e-78. Further, the proposed probability-based WOA approach significantly improves the performance of the IoT network.
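A minimal sketch of the WOA core is shown below, applied to a toy objective that stands in for a routing cost. The population size, iteration count, bounds, and the sphere objective are assumptions for illustration; the paper's actual objective encodes path and hop information.

```python
# Minimal Whale Optimization Algorithm (WOA) sketch on a toy objective.
# Population size, iteration count, and bounds are assumptions.

import math, random

def woa(objective, dim=2, pop=20, iters=200, lo=-10.0, hi=10.0, seed=1):
    rng = random.Random(seed)
    whales = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    best = min(whales, key=objective)[:]
    for t in range(iters):
        a = 2.0 * (1 - t / iters)          # coefficient decreasing 2 -> 0
        for w in whales:
            p = rng.random()
            A = 2 * a * rng.random() - a
            C = 2 * rng.random()
            # Encircle the best whale when |A| < 1, else explore a random one.
            ref = best if abs(A) < 1 else whales[rng.randrange(pop)]
            for d in range(dim):
                if p < 0.5:                 # shrinking-encircling move
                    w[d] = ref[d] - A * abs(C * ref[d] - w[d])
                else:                       # spiral (bubble-net) move
                    l = rng.uniform(-1, 1)
                    w[d] = (abs(best[d] - w[d]) * math.exp(l)
                            * math.cos(2 * math.pi * l) + best[d])
                w[d] = min(max(w[d], lo), hi)
            if objective(w) < objective(best):
                best = w[:]
    return best, objective(best)

sphere = lambda x: sum(v * v for v in x)   # stand-in for a routing cost
best, val = woa(sphere)                    # val shrinks toward 0
```

The extremely small objective values reported in the comparative analysis (on the order of 1e-78) are typical of WOA on smooth test objectives, where the encircling step contracts the population geometrically around the incumbent best.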


2021 ◽  
Vol 17 (3) ◽  
pp. 1-25
Author(s):  
Guangrong Zhao ◽  
Bowen Du ◽  
Yiran Shen ◽  
Zhenyu Lao ◽  
Lizhen Cui ◽  
...  

In this article, we propose LeaD, a new vibration-based communication protocol to Learn the unique patterns of vibration to Decode the short messages transmitted to smart IoT devices. Unlike existing vibration-based communication protocols that decode short messages symbol-wise, either in binary or multi-ary form, the message recipient in LeaD receives vibration signals corresponding to bit-groups. Each group consists of multiple symbols sent in a burst, and the receiver decodes the group of symbols as a whole via a machine learning-based approach. The insight behind LeaD is that different combinations of symbols (1s or 0s) in a group produce unique and reproducible vibration patterns. Therefore, decoding in vibration-based communication can be modeled as a pattern classification problem. We design and implement a number of different machine learning models as the core engine of LeaD's decoding algorithm to learn and recognize the vibration patterns. Through intensive evaluations on a large amount of collected data, the Convolutional Neural Network (CNN)-based model achieves the highest decoding accuracy (i.e., lowest error rate), up to 97% at a relatively high bit rate of 40 bits/s, whereas competing vibration-based communication protocols achieve transmission rates of only 10 bits/s and 20 bits/s with similar decoding accuracy. Furthermore, we evaluate its performance under different challenging practical settings, and the results show that LeaD with the CNN engine is robust to poses, distances (within the valid range), and types of devices; therefore, a CNN model can be trained beforehand and applied widely across different IoT devices under different circumstances. Finally, we implement LeaD on both an off-the-shelf smartphone and a smartwatch to measure its detailed resource consumption on smart devices.
The computation time and energy consumption of its different components show that LeaD is lightweight and can run in situ on low-cost smart IoT devices, e.g., smartwatches, without accumulated delay, introducing only marginal system overhead.
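To illustrate the idea of decoding bit-groups as pattern classification, the toy sketch below matches noisy synthetic "vibration" waveforms against per-group templates. A simple nearest-template classifier stands in for the paper's CNN, and the waveform shapes and 2-bit groups are assumptions for demonstration only.

```python
# Toy illustration of group-wise decoding as pattern classification.
# A nearest-template classifier stands in for the CNN; waveforms are synthetic.

import math, random

SYMBOL_GROUPS = ["00", "01", "10", "11"]   # assumed 2-bit groups

def waveform(group, n=64, noise=0.0, rng=random):
    # Each bit-group maps to a distinct, reproducible vibration pattern
    # (here: a sinusoid whose frequency depends on the group).
    freq = 1 + SYMBOL_GROUPS.index(group)
    return [math.sin(2 * math.pi * freq * i / n) + noise * rng.gauss(0, 1)
            for i in range(n)]

# "Training": record one clean template per group.
templates = {g: waveform(g) for g in SYMBOL_GROUPS}

def decode(signal):
    # Classify a received burst as the group with the closest template.
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    return min(templates, key=lambda g: dist(signal, templates[g]))

rng = random.Random(42)
msg = ["10", "01", "11", "00"]
received = [waveform(g, noise=0.3, rng=rng) for g in msg]
decoded = [decode(s) for s in received]    # recovers msg despite noise
```

Decoding a whole group per burst rather than symbol-by-symbol is what lets the effective bit rate grow with group size, provided the classifier can still separate the group patterns.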


Sensors ◽  
2020 ◽  
Vol 20 (8) ◽  
pp. 2231
Author(s):  
Daoqi Han ◽  
Xiaofeng Du ◽  
Yueming Lu

Resource constraints have prevented comprehensive cryptography and multifactor authentication in numerous Internet of Things (IoT) connectivity scenarios. Existing IoT systems generally adopt lightweight security protocols that lead to compromise and privacy leakage. Edge computing enables better access control and privacy protection; furthermore, blockchain architecture has achieved a trusted store of value through open-source and distributed consensus mechanisms. To embrace these new paradigms, we propose a scheme that employs one-time association multitasking proofs for peer-to-local authentication (OTMP-P2L). The scheme chooses relevant nondeterministic polynomial (NP) problem tasks and manages localized trust and anonymity by using smart devices such as phones and pads, thereby enabling IoT devices to autonomously perform consensus validation with an enhanced message authentication code. This nested code is a one-time zero-knowledge proof comprising multiple logic verification arguments. To increase diversity and reduce the workload of each argument, the arguments are chained by a method that derives some of the inputs of each task from the outputs of previous tasks. We implemented a smart lock system and confirmed that the scheme outperforms existing IoT authentication methods. The results demonstrate superior flexibility through dynamic difficulty strategies and succinct non-interactive peer-to-peer (P2P) verification.
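The chaining idea, where some inputs of each task are derived from the outputs of previous tasks, can be sketched with toy hash-puzzle tasks as below. The puzzle form, difficulty parameter, and all names are assumptions for illustration; the actual scheme uses NP-problem tasks and zero-knowledge arguments, not hash puzzles.

```python
# Sketch of chaining verification tasks: each task's input depends on the
# previous task's output. Toy hash puzzles stand in for the scheme's NP tasks.

import hashlib

def solve_task(seed, difficulty=2):
    """Toy proof task: find a nonce whose hash with `seed` has leading zeros."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{seed}:{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce, digest
        nonce += 1

def chained_proofs(message, n_tasks=3):
    proofs, seed = [], message
    for _ in range(n_tasks):
        nonce, digest = solve_task(seed)
        proofs.append((seed, nonce))
        seed = digest            # next task's input comes from this output
    return proofs

def verify(message, proofs, difficulty=2):
    seed = message
    for s, nonce in proofs:
        if s != seed:            # the chain must start from the real message
            return False
        digest = hashlib.sha256(f"{seed}:{nonce}".encode()).hexdigest()
        if not digest.startswith("0" * difficulty):
            return False
        seed = digest
    return True

proofs = chained_proofs("unlock:door42")   # e.g., a smart-lock request
```

Because each link re-derives the next seed during verification, tampering with the message or any intermediate proof breaks the whole chain, which is what allows difficulty to be tuned per task while keeping verification cheap.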


2021 ◽  
Vol 10 (1) ◽  
pp. 13
Author(s):  
Claudia Campolo ◽  
Giacomo Genovese ◽  
Antonio Iera ◽  
Antonella Molinaro

Several Internet of Things (IoT) applications are booming that rely on advanced artificial intelligence (AI) and, in particular, machine learning (ML) algorithms to assist users and make decisions on their behalf in a wide variety of contexts, such as smart homes, smart cities, and smart factories. Although the traditional approach is to deploy such compute-intensive algorithms in the centralized cloud, the recent proliferation of low-cost, AI-powered microcontrollers and consumer devices paves the way for spreading intelligence pervasively along the cloud-to-things continuum. The take-off of such a promising vision may be hindered by the resource constraints of IoT devices and by the heterogeneity of (mostly proprietary) AI-embedded software and hardware platforms. In this paper, we propose a solution for distributed AI deployment at the deep edge, founded on the IoT virtualization concept. We design a virtualization layer, hosted at the network edge, that is in charge of the semantic description of AI-embedded IoT devices and hence can expose as well as augment their cognitive capabilities in order to feed intelligent IoT applications. The proposal has been devised with the twofold aim of (i) relieving the pressure on constrained devices solicited by multiple parties interested in accessing their generated data and inferences, and (ii) targeting interoperability among AI-powered platforms. A Proof-of-Concept (PoC) is provided to showcase the viability and advantages of the proposed solution.


Sensors ◽  
2021 ◽  
Vol 21 (14) ◽  
pp. 4798
Author(s):  
Fangni Chen ◽  
Anding Wang ◽  
Yu Zhang ◽  
Zhengwei Ni ◽  
Jingyu Hua

With the increasing deployment of IoT devices and applications, a large number of devices that can sense and monitor the environment are needed in the IoT network. This trend also brings great challenges, such as data explosion and energy insufficiency. This paper proposes a system that integrates mobile edge computing (MEC) technology and simultaneous wireless information and power transfer (SWIPT) technology to improve the service supply capability of WSN-assisted IoT applications. A novel optimization problem is formulated to minimize the total system energy consumption under data transmission rate and transmit power constraints by jointly considering power allocation, CPU frequency, the offloading weight factor, and the energy harvesting weight factor. Since the problem is non-convex, we propose a novel alternate group iteration optimization (AGIO) algorithm, which decomposes the original problem into three subproblems and alternately optimizes each subproblem using the group interior-point iterative algorithm. Numerical simulations validate that the energy consumption of our proposed design is much lower than that of the two benchmark algorithms. The relationship between system variables and the energy consumption of the system is also discussed.
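The alternating decomposition can be illustrated on a toy coupled objective: fix one block of variables, solve the subproblem for the other in closed form, and repeat. The quadratic objective below is an assumption for demonstration, not the paper's energy model.

```python
# Alternating (block-coordinate) minimization sketch, in the spirit of
# decomposing a joint problem into subproblems solved in turn.
# Toy objective (assumption): f(x, y) = (x - 2)^2 + (y + 1)^2 + 0.5*x*y

def alternating_minimize(iters=100):
    x, y = 0.0, 0.0
    for _ in range(iters):
        # argmin over x with y fixed: df/dx = 2(x - 2) + 0.5*y = 0
        x = 2.0 - y / 4.0
        # argmin over y with x fixed: df/dy = 2(y + 1) + 0.5*x = 0
        y = -1.0 - x / 4.0
    return x, y

x, y = alternating_minimize()   # converges to the joint minimizer (2.4, -1.6)
```

Each subproblem here is convex and has a closed form, so the alternation contracts to the joint optimum; in the paper's non-convex setting the same structure is used, with an interior-point solver playing the role of the closed-form step.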


2018 ◽  
Vol 10 (3) ◽  
pp. 61-83 ◽  
Author(s):  
Deepali Chaudhary ◽  
Kriti Bhushan ◽  
B.B. Gupta

This article describes how cloud computing has emerged as a strong competitor against traditional IT platforms by offering low-cost and “pay-as-you-go” computing potential and on-demand provisioning of services. Governments, as well as organizations, have migrated their entire or most of the IT infrastructure to the cloud. With the emergence of IoT devices and big data, the amount of data forwarded to the cloud has increased to a huge extent. Therefore, the paradigm of cloud computing is no longer sufficient. Furthermore, with the growth of demand for IoT solutions in organizations, it has become essential to process data quickly, substantially and on-site. Hence, Fog computing is introduced to overcome these drawbacks of cloud computing by bringing intelligence to the edge of the network using smart devices. One major security issue related to the cloud is the DDoS attack. This article discusses in detail about the DDoS attack, cloud computing, fog computing, how DDoS affect cloud environment and how fog computing can be used in a cloud environment to solve a variety of problems.


Sensors ◽  
2018 ◽  
Vol 18 (8) ◽  
pp. 2548 ◽  
Author(s):  
Run Tian ◽  
Lin Ma ◽  
Zhe Wang ◽  
Xuezhi Tan

This paper considers interference management and capacity improvement for Internet of Things (IoT) oriented two-tier networks by exploiting cognition between network tiers with interference alignment (IA). More specifically, we target our efforts on next generation two-tier networks, where a tier of femtocells serving multiple IoT devices shares the licensed spectrum with a tier of pre-existing macrocells via cognitive radio. Aiming to manage the cross-tier interference caused by cognitive spectrum sharing as well as to ensure optimal femtocell capacity, two novel self-organizing cognitive IA schemes are proposed. First, we propose an interference nulling based cognitive IA scheme. In this scheme, both co-tier and cross-tier interference is aligned into the orthogonal subspace at each IoT receiver, which means all interference can be perfectly eliminated without causing any performance degradation on the macrocell. However, it is known that the interference nulling based IA algorithm achieves its optimum only in high signal to noise ratio (SNR) scenarios, where the noise power is negligible. Consequently, when the interference-free constraint imposed on the femtocell can be relaxed, we also present a partial cognitive IA scheme that further enhances network performance at low and intermediate SNR. Additionally, the feasibility conditions and capacity analyses of the proposed schemes are provided. Both theoretical and numerical results demonstrate that the proposed cognitive IA schemes outperform traditional orthogonal precoding methods in terms of network capacity, while preserving the desired quality of service for macrocell users.
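The interference-nulling step can be sketched in a few lines of linear algebra: choose a receive filter orthogonal to the subspace spanned by the interference directions, so the residual interference is exactly zero while the desired signal survives. The 3-dimensional toy channel vectors below are assumptions for illustration.

```python
# Sketch of interference nulling at a receiver: project the desired channel
# direction outside the interference subspace (toy real-valued channels).

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def project_out(v, basis):
    """Remove from v its components along each (orthogonal) basis vector."""
    for b in basis:
        coef = dot(v, b) / dot(b, b)
        v = [vi - coef * bi for vi, bi in zip(v, b)]
    return v

# Interference arrives along these directions (e.g., co-tier + cross-tier).
interference = [[1.0, 0.0, 1.0], [0.0, 1.0, 1.0]]

# Orthogonalize the interference subspace (Gram-Schmidt).
basis = []
for v in interference:
    basis.append(project_out(v[:], basis))

# Desired signal direction; the nulling filter is its residual outside
# the interference subspace.
h = [1.0, 1.0, 0.0]
w = project_out(h[:], basis)

residual = [dot(w, i) for i in interference]   # ~0: interference nulled
gain = dot(w, h)                               # nonzero: signal preserved
```

The feasibility condition is visible in the dimensions: with two interference directions in a 3-dimensional receive space, exactly one interference-free dimension is left for the desired signal.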


2013 ◽  
Vol 470 ◽  
pp. 611-616
Author(s):  
Xuan Jie Ning ◽  
Hai Zhao ◽  
Mao Fan Yang ◽  
Dan Wu

This paper is concerned with the capacity of ad hoc networks employing the pure ALOHA medium access control (MAC) protocol under different transmission power levels and variable data rate control. The data rate of a given link is related to the signal to interference plus noise ratio (SINR), and the SINR is, in turn, related to the transmitted power and link distance. Increasing the power yields a higher data rate but also raises network interference. Consequently, the optimum power that yields maximum network throughput is a tradeoff between transmission rate and network interference. A mathematical model analysis of ad hoc network capacity is presented in the paper, and a revised expression for approximately calculating the capture probability in networks is proposed.
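The rate-versus-interference tradeoff has a classic closed form in the pure ALOHA setting, where throughput S(G) = G·e^(−2G) peaks at offered load G = 0.5. The quick numerical check below illustrates that an interior optimum exists, analogous to the optimum-power tradeoff analyzed in the paper (which additionally models SINR and capture).

```python
# Classic pure ALOHA throughput S(G) = G * exp(-2G): higher offered load G
# means more transmissions but also more collisions, so throughput peaks
# at an interior operating point.

import math

def throughput(G):
    return G * math.exp(-2 * G)

loads = [i / 100 for i in range(1, 201)]      # G from 0.01 to 2.00
best_G = max(loads, key=throughput)           # peaks at G = 0.5
peak = throughput(best_G)                     # = 1 / (2e) ~ 0.184
```

Setting dS/dG = e^(−2G)(1 − 2G) = 0 confirms the maximum at G = 0.5 with peak throughput 1/(2e); the paper's tradeoff between transmit power and network interference has the same unimodal character.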

