IoT Resource Allocation and Optimization Based on Heuristic Algorithm

Sensors ◽  
2020 ◽  
Vol 20 (2) ◽  
pp. 539 ◽  
Author(s):  
Arun Kumar Sangaiah ◽  
Ali Asghar Rahmani Hosseinabadi ◽  
Morteza Babazadeh Shareh ◽  
Seyed Yaser Bozorgi Rad ◽  
Atekeh Zolfagharian ◽  
...  

The Internet of Things (IoT) is a distributed system that connects everything via the internet. An IoT infrastructure contains multiple resources and gateways. In such a system, optimizing IoT resource allocation and scheduling (IRAS) is vital, because resource allocation (RA) and scheduling deal with the mapping between resources and gateways and are responsible for optimally allocating resources to the available gateways. In the IoT environment, a gateway may face hundreds of resources to connect, so manual resource allocation and scheduling is not feasible. In this paper, the whale optimization algorithm (WOA) is used to solve the RA problem in IoT, with the aim of optimal RA and a reduced total communication cost between resources and gateways. The proposed algorithm is compared with existing algorithms, and the results indicate its good performance: across various benchmarks, the proposed method outperforms the others in terms of total communication cost.
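The core of the approach this abstract describes can be sketched in a few lines: a population of candidate assignments is moved through the WOA's encircling, spiral, and random-search updates, and each candidate is scored by total communication cost. Everything below (the cost-matrix shape, decoding by rounding, the parameter values) is an illustrative assumption, not the paper's implementation.

```python
import math
import random

def total_cost(assign, cost):
    # total communication cost of mapping resource r -> gateway assign[r]
    return sum(cost[r][g] for r, g in enumerate(assign))

def decode(pos, n_gw):
    # round each continuous coordinate to a valid gateway index
    return [min(n_gw - 1, max(0, int(round(x)))) for x in pos]

def woa_allocate(cost, pop=20, iters=100, seed=0):
    """Whale-optimization-style search for a low-cost assignment."""
    rng = random.Random(seed)
    n_res, n_gw = len(cost), len(cost[0])
    X = [[rng.uniform(0, n_gw - 1) for _ in range(n_res)] for _ in range(pop)]
    best = min(X, key=lambda p: total_cost(decode(p, n_gw), cost))[:]
    for t in range(iters):
        a = 2.0 - 2.0 * t / iters           # a decreases linearly from 2 to 0
        for i in range(pop):
            A = 2.0 * a * rng.random() - a
            C = 2.0 * rng.random()
            if rng.random() < 0.5:
                if abs(A) < 1:              # exploit: encircle the best whale
                    ref = best
                else:                       # explore: move toward a random whale
                    ref = X[rng.randrange(pop)]
                X[i] = [r - A * abs(C * r - x) for r, x in zip(ref, X[i])]
            else:                           # spiral update around the best whale
                l = rng.uniform(-1, 1)
                X[i] = [abs(r - x) * math.exp(l) * math.cos(2 * math.pi * l) + r
                        for r, x in zip(best, X[i])]
            if total_cost(decode(X[i], n_gw), cost) < total_cost(decode(best, n_gw), cost):
                best = X[i][:]
    return decode(best, n_gw)
```

With no capacity constraints, the per-resource minimum of each cost row is a lower bound on the achievable total, which makes small instances easy to sanity-check.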

2021 ◽  
Vol 21 (3) ◽  
pp. 1-22
Author(s):  
Celestine Iwendi ◽  
Saif Ur Rehman ◽  
Abdul Rehman Javed ◽  
Suleman Khan ◽  
Gautam Srivastava

In this digital age, human dependency on technology in various fields has been increasing tremendously. Torrential amounts of electronic products are manufactured daily for everyday use. With this advancement in Internet technology, cybersecurity of software and hardware systems is now a prerequisite for major business operations. Every technology on the market has multiple vulnerabilities that hackers and cyber-criminals exploit daily to manipulate data, sometimes for malicious purposes. In any system, the Intrusion Detection System (IDS) is a fundamental component for securing devices against digital attacks. Recognizing newly developing digital threats is getting harder for existing IDSs, and advanced frameworks are required for an IDS to function both efficiently and effectively. The commonly observed cyber-attacks in the business domain include minor attacks used for stealing private data. This article presents a deep learning methodology for detecting cyber-attacks on the Internet of Things using a Long Short-Term Memory (LSTM) network classifier. Our extensive experimental testing shows an accuracy of 99.09%, an F1-score of 99.46%, and a recall of 99.51%. A detailed tabular comparison shows that our model detects cyber-attacks more proficiently than other state-of-the-art models.


IEEE Access ◽  
2020 ◽  
Vol 8 ◽  
pp. 57192-57203 ◽  
Author(s):  
Yanhua He ◽  
Sunxuan Zhang ◽  
Liangrui Tang ◽  
Yun Ren

Author(s):  
Sun-ha Hong

Today, machines observe, record, and sense the world—not just for us but also often instead of us and indifferently to our meaning. The intertwined problems of technological knowledge and (our) knowledge of technology manifest in the growing industry of smart machines, the Internet of Things, and other means for self-tracking. The automation of the care of the self is buoyed by a popular fantasy of data’s intimacy, of machines that know you better than yourself. Yet as the technology becomes normalized, the hacker ethic gives way to a market-driven shift in which more and more of “my” personal truth is colonized by machines (and the people behind the machines) that I cannot question.


2020 ◽  
Vol 107 ◽  
pp. 498-508
Author(s):  
Kaile Xiao ◽  
Zhipeng Gao ◽  
Weisong Shi ◽  
Xuesong Qiu ◽  
Yang Yang ◽  
...  

2021 ◽  
Author(s):  
Malik Bader Alazzam ◽  
Fawaz Alassery

The Internet of Things (IoT) has been applied to a variety of sectors, including smart grids, farming, weather prediction, power generation, wastewater treatment, and so on. While the IoT holds enormous promise across a wide range of applications, there are still areas where it can be improved. The present research focuses on reducing the energy consumption of devices in IoT networks, which results in a longer network lifetime. In this study, the most suitable Cluster Head (CH) in the IoT network is determined in order to optimize energy consumption. The proposed study uses a hybrid meta-heuristic that combines the Whale Optimization Algorithm (WOA) with an Evolutionary Algorithm (EA). Various quantifiable metrics, including the number of alive nodes, workload, temperature, residual energy, and an objective value, were used to form the IoT network clusters. The proposed method is then compared with several state-of-the-art optimization techniques, including the Artificial Bee Colony algorithm, neural networks, and Adapted Gravity Simulated Annealing. The findings show that the proposed hybrid method outperforms conventional methods.
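As a toy illustration of how the metrics listed above could be combined when scoring Cluster Head candidates, the sketch below uses a simple weighted fitness; the field names, weights, and greedy selection stand in for the WOA+EA search and are assumptions, not the paper's formulation.

```python
def ch_fitness(node, weights):
    """Score a CH candidate from the metrics the abstract names:
    residual energy and alive neighbours help, workload and
    temperature hurt. Higher is better; weights are illustrative."""
    w_e, w_l, w_t, w_n = weights
    return (w_e * node["residual_energy"]
            - w_l * node["workload"]
            - w_t * node["temperature"]
            + w_n * node["alive_neighbours"])

def select_cluster_head(nodes, weights=(0.4, 0.2, 0.1, 0.3)):
    # the meta-heuristic (WOA+EA in the paper) would search over such
    # scores; this sketch simply picks the best-scoring node
    return max(nodes, key=lambda nd: ch_fitness(nd, weights))
```

A node with high residual energy and many alive neighbours is favoured, which is what extends network lifetime in cluster-based schemes.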


2020 ◽  
Vol 117 (11) ◽  
pp. 5624-5630 ◽  
Author(s):  
Giulia Fanti ◽  
Nina Holden ◽  
Yuval Peres ◽  
Gireeja Ranade

Motivated by applications in wireless networks and the Internet of Things, we consider a model of n nodes trying to reach consensus with high probability on their majority bit. Each node i is assigned a bit at time 0 and is a finite automaton with m bits of memory (i.e., 2^m states) and a Poisson clock. When the clock of i rings, i can choose to communicate and is then matched to a uniformly chosen node j. The nodes j and i may update their states based on the state of the other node. Previous work has focused on minimizing the time to consensus and the probability of error, while our goal is minimizing the number of communications. We show that, when m > 3 log log log(n), consensus can be reached with linear communication cost, but this is impossible if m < log log log(n). A key step is to distinguish when nodes can become aware of knowing the majority bit and stop communicating. We show that this is impossible if their memory is too low.
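The communication model is easy to simulate. The toy below uses the classic four-state exact-majority automaton (two "strong" and two "weak" states, i.e., m = 2 bits of memory) under uniformly random pairwise matchings; it illustrates the model, not the memory-efficient protocol the paper constructs. Opposite strong states cancel into weak states, so the difference between strong-plus and strong-minus counts is invariant, and the surviving strong sign is broadcast to all weak nodes.

```python
import random

# states are (kind, sign): ('s', +1)/('s', -1) strong, ('w', +1)/('w', -1) weak
def interact(a, b):
    # four-state exact-majority rules:
    # opposite strong states cancel into weak states of their own signs;
    # a strong state converts a weak state of the opposite sign.
    if a[0] == 's' and b[0] == 's' and a[1] != b[1]:
        return ('w', a[1]), ('w', b[1])
    if a[0] == 's' and b[0] == 'w' and a[1] != b[1]:
        return a, ('w', a[1])
    if b[0] == 's' and a[0] == 'w' and a[1] != b[1]:
        return ('w', b[1]), b
    return a, b

def simulate(n_plus, n_minus, max_steps=1_000_000, seed=0):
    """Random pairwise matchings until every node carries the same sign.
    Returns (consensus sign, interactions used) or (None, max_steps)."""
    rng = random.Random(seed)
    states = [('s', +1)] * n_plus + [('s', -1)] * n_minus
    n = len(states)
    for step in range(max_steps):
        i, j = rng.randrange(n), rng.randrange(n)
        if i == j:
            continue
        states[i], states[j] = interact(states[i], states[j])
        if len({s[1] for s in states}) == 1:
            return states[0][1], step + 1
    return None, max_steps
```

With a strict majority, the simulation converges to the majority bit; the number of interactions consumed is exactly the "communication cost" the paper seeks to minimize.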


2018 ◽  
Vol 7 (2.7) ◽  
pp. 25 ◽  
Author(s):  
A Yasaswini ◽  
K V. DayaSagar ◽  
K ShriVishnu ◽  
V Hari Nandan ◽  
PVRD Prasadara Rao

Fog computing reduces the complexity of a network architecture and processes data in a fog node, the IoT Hub. The Internet of Things (IoT) will connect billions of devices, such as smart objects, which are heterogeneous in terms of hardware, software, and communication interfaces. The IoT has so far grown as numerous vertical solutions, rather than focusing on the creation of a highly interoperable infrastructure for the development of applications. To manage the various devices present in the hub, we create an IP-based infrastructure so that its elements can handle the diversified devices, and the network elements are used to enhance the direct end-to-end communication that is much needed. With these considerations, we propose a fog node, i.e., an IoT hub, that can be placed at the edge of multiple networks, increasing capability by implementing various functions such as resource allocation, border routing, and cache proxy servers. As the fog node is implemented through the IoT hub, we aim to automate the resource allocation that takes place in the IoT hub.
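The automated resource allocation such a hub performs can be sketched as a greedy least-loaded policy: each incoming device request is assigned to the resource with the lowest current load that still has spare capacity, and requests that fit nowhere are deferred. The request/capacity model and the names below are illustrative assumptions, not the paper's design.

```python
def allocate(requests, capacities):
    """Assign each (request, demand) pair to the least-loaded resource
    that still has room; return ({request: resource}, deferred list)."""
    loads = {name: 0 for name in capacities}
    placed, deferred = {}, []
    for req, demand in requests:
        # resources that can absorb this request without exceeding capacity
        feasible = [n for n in loads if loads[n] + demand <= capacities[n]]
        if not feasible:
            deferred.append(req)      # the hub must queue or reject it
            continue
        # least-loaded first; ties broken deterministically by name
        target = min(feasible, key=lambda n: (loads[n], n))
        loads[target] += demand
        placed[req] = target
    return placed, deferred
```

A real hub would also weigh latency and device class, but the same skeleton applies: admission check, then placement by a load-balancing criterion.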

