SoftSystem: Smart Edge Computing Device Selection Method for IoT Based on Soft Set Technique

2020 ◽  
Vol 2020 ◽  
pp. 1-10
Author(s):  
Muhammad Shafiq ◽  
Zhihong Tian ◽  
Ali Kashif Bashir ◽  
Korhan Cengiz ◽  
Adnan Tahir

The Internet of Things (IoT) is growing day by day as new IoT devices are introduced and interconnected. Due to this rapid growth, IoT faces several communication issues in edge computing networks. The critical issue in these networks is effective edge computing IoT device selection whenever several edge nodes are available to carry information. To overcome this problem, this paper proposes a new framework named SoftSystem, based on the soft set technique, that recommends useful IoT devices, together with an algorithm named Softsystemalgo. For the proposed system, three parameters are considered: IoT Device Security (IDSC), IoT Device Storage (IDST), and IoT Device Communication Speed (IDCS). The most significant parameters are also identified from the given set of parameters. The results show that the proposed system is effective for selecting edge computing devices in an IoT network.
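As a minimal sketch of the idea, a soft set over a device universe can be written as a table of 0/1 memberships against the three parameters named in the abstract, with devices ranked by their row-sum choice values. The device data and the simple scoring rule below are illustrative assumptions, not the paper's actual Softsystemalgo.

```python
# Illustrative soft-set scoring of candidate edge devices.
# Parameter names follow the abstract (IDSC, IDST, IDCS); the device
# table and the row-sum choice rule are assumptions for illustration.

PARAMS = ["IDSC", "IDST", "IDCS"]

# A soft set over the device universe: 1 means the device satisfies
# the parameter, 0 means it does not.
soft_set = {
    "dev_a": {"IDSC": 1, "IDST": 0, "IDCS": 1},
    "dev_b": {"IDSC": 1, "IDST": 1, "IDCS": 1},
    "dev_c": {"IDSC": 0, "IDST": 1, "IDCS": 0},
}

def choice_values(s):
    """Row sums of the tabular soft set (choice values per device)."""
    return {dev: sum(row[p] for p in PARAMS) for dev, row in s.items()}

def best_device(s):
    """Recommend the device with the highest choice value."""
    scores = choice_values(s)
    return max(scores, key=scores.get)

def parameter_support(s):
    """Column sums: how many devices satisfy each parameter."""
    return {p: sum(row[p] for row in s.values()) for p in PARAMS}

if __name__ == "__main__":
    print(best_device(soft_set))      # dev_b satisfies all three parameters
    print(parameter_support(soft_set))
```

The column sums give one simple way to ask which parameter is most discriminating across the candidate set.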

Sensors ◽  
2019 ◽  
Vol 19 (20) ◽  
pp. 4375 ◽  
Author(s):  
Yuxuan Wang ◽  
Jun Yang ◽  
Xiye Guo ◽  
Zhi Qu

As one of the information industry’s future development directions, the Internet of Things (IoT) has been widely used. In order to reduce the pressure on the network caused by the long distance between the processing platform and the terminal, edge computing provides a new paradigm for IoT applications. In many scenarios, the IoT devices are distributed in remote areas or extreme terrain and cannot be accessed directly through the terrestrial network, so data transmission can only be achieved via satellite. However, traditional satellites are highly customized, and on-board resources are designed for specific applications rather than universal computing. Therefore, we propose to transform the traditional satellite into a space edge computing node. It can dynamically load software in orbit, flexibly share on-board resources, and provide services coordinated with the cloud. The corresponding hardware structure and software architecture of the satellite are presented. Through modeling analysis and simulation experiments of the application scenarios, the results show that the space edge computing system takes less time and consumes less energy than the traditional satellite constellation. The quality of service is mainly related to the number of satellites, satellite performance, and the task offloading strategy.
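The time saving can be illustrated with a toy delay model: a bent-pipe satellite must downlink all raw data before a ground cloud can process it, while a space edge node processes on board and downlinks only the (much smaller) result. All numbers below are illustrative assumptions, not the paper's simulation parameters.

```python
# Toy delay model contrasting a bent-pipe relay satellite with a
# space edge computing node. Rates and sizes are illustrative only.

def bent_pipe_delay(raw_mb, link_mbps, ground_rtt_s, cloud_rate_mb_s):
    transmit = raw_mb * 8 / link_mbps        # downlink all raw data
    compute = raw_mb / cloud_rate_mb_s       # process in the ground cloud
    return transmit + ground_rtt_s + compute

def space_edge_delay(raw_mb, result_mb, link_mbps, sat_rate_mb_s):
    compute = raw_mb / sat_rate_mb_s         # process on board (slower CPU)
    transmit = result_mb * 8 / link_mbps     # downlink only the result
    return compute + transmit

if __name__ == "__main__":
    bp = bent_pipe_delay(raw_mb=100, link_mbps=50, ground_rtt_s=0.27,
                         cloud_rate_mb_s=200)
    se = space_edge_delay(raw_mb=100, result_mb=1, link_mbps=50,
                          sat_rate_mb_s=20)
    print(f"bent pipe: {bp:.2f}s, space edge: {se:.2f}s")
```

Even with a much slower on-board processor, avoiding the bulk downlink dominates when the result is small relative to the raw data.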


2021 ◽  
Vol 12 (1) ◽  
pp. 140
Author(s):  
Seunghwan Lee ◽  
Linh-An Phan ◽  
Dae-Heon Park ◽  
Sehan Kim ◽  
Taehong Kim

With the exponential growth of the Internet of Things (IoT), edge computing is in the limelight for its ability to quickly and efficiently process the large volumes of data generated by IoT devices. EdgeX Foundry is a representative open-source IoT gateway platform, providing various IoT protocol services and interoperability between them. However, due to the absence of container orchestration technology, such as automated deployment and dynamic resource management for application services, EdgeX Foundry has fundamental limitations as an edge computing platform. In this paper, we propose EdgeX over Kubernetes, which enables remote deployment and autoscaling of application services by running EdgeX Foundry over Kubernetes, a production-grade container orchestration tool. Experimental evaluation results show that the proposed platform increases manageability through the remote deployment of application services and improves system throughput and service quality with real-time monitoring and autoscaling.
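As a hypothetical sketch of what "EdgeX over Kubernetes" involves, one EdgeX microservice can be wrapped in a Deployment and given a HorizontalPodAutoscaler so Kubernetes scales it on CPU load. The service name, image tag, and thresholds below are assumptions for illustration, not the paper's actual manifests.

```yaml
# Hypothetical manifest: one EdgeX microservice under Kubernetes with
# CPU-based autoscaling. Names, image, and thresholds are illustrative.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: edgex-device-virtual
spec:
  replicas: 1
  selector:
    matchLabels: { app: edgex-device-virtual }
  template:
    metadata:
      labels: { app: edgex-device-virtual }
    spec:
      containers:
        - name: device-virtual
          image: edgexfoundry/device-virtual:latest   # assumed tag
          resources:
            requests: { cpu: "100m" }
            limits: { cpu: "500m" }
---
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: edgex-device-virtual-hpa
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: edgex-device-virtual
  minReplicas: 1
  maxReplicas: 5
  metrics:
    - type: Resource
      resource:
        name: cpu
        target: { type: Utilization, averageUtilization: 70 }
```

The CPU request is what the HPA's utilization target is measured against, so autoscaling only works once requests are set on the container.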


2018 ◽  
Vol 6 (4) ◽  
pp. 117-131
Author(s):  
Matt Sinda ◽  
Tyler Danner ◽  
Sean O'Neill ◽  
Abeer Alqurashi ◽  
Haeng-Kon Kim

The Internet of Things (IoT) is becoming more pervasive in our daily lives and is being used to add convenience to everyday items. Several standards allow these devices to communicate with each other and, ultimately, with our mobile devices. However, in the rush to meet market demand, security was often not considered until after a device had already been placed on the market. Most of the work done on improving security has been in the area of encryption; however, the relatively small footprint of IoT devices makes strong encryption difficult. In this article, the authors present a new framework for improving security that focuses on the timing of frequency hopping, particularly in Bluetooth. Their method shows that the current algorithm used to determine the next Bluetooth frequency hop is vulnerable to attack, and they suggest a novel algorithm to select the next frequency more securely. They simulate their solution algorithmically to showcase their approach and, in doing so, demonstrate that it moves to the next frequency in a more random pattern than the existing model achieves. The results show that focusing on different timing sequences for how long a device stays on a particular frequency both fits the current Bluetooth Low Energy architecture and provides adequate security for IoT devices, as it is demonstrably more random than the existing architecture.
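To make the contrast concrete, the sketch below compares the legacy BLE hop rule (next channel = last channel plus a fixed increment, modulo 37) with picking the next channel and the dwell time from a cryptographically secure RNG. This is in the spirit of the authors' proposal, but the dwell-time rule and parameters are assumptions, not their exact algorithm.

```python
# Legacy BLE channel hop vs. CSPRNG-based channel and dwell selection.
# The randomized dwell-time rule is an illustrative assumption.
import secrets

NUM_CHANNELS = 37  # BLE data channels

def ble_next_channel(last_channel, hop_increment):
    """Legacy BLE hop: the unmapped next channel is fully deterministic."""
    return (last_channel + hop_increment) % NUM_CHANNELS

def random_next_channel(allowed):
    """Pick the next channel uniformly from the allowed map via a CSPRNG."""
    return secrets.choice(sorted(allowed))

def random_dwell_ms(lo=5, hi=20):
    """Randomized time to stay on the current channel, in milliseconds."""
    return lo + secrets.randbelow(hi - lo + 1)

if __name__ == "__main__":
    ch = 3
    for _ in range(5):
        ch = ble_next_channel(ch, hop_increment=7)
        print("legacy hop:", ch)   # an attacker can predict this sequence
    allowed = set(range(NUM_CHANNELS))
    print("random hop:", random_next_channel(allowed),
          "dwell:", random_dwell_ms(), "ms")
```

Because the legacy sequence is fixed once the increment is known, observing a few hops reveals all future channels; the CSPRNG variant removes that predictability at the cost of needing a synchronized random source on both sides.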


Cybersecurity ◽  
2021 ◽  
Vol 4 (1) ◽  
Author(s):  
Runchen Gao ◽  
Shen Li ◽  
Yuqi Gao ◽  
Rui Guo

With the large-scale application of 5G in industrial production, the Internet of Things has become an important technology for various industries to achieve efficiency improvements and digital transformation with the help of mobile edge computing. In modern industry, users often store data collected by IoT devices in the cloud, but the data at the edge of the network involves a large amount of sensitive information, which increases the risk of privacy leakage. In order to address these challenges, we propose a security strategy for edge computing that combines the Feistel architecture with short comparable encryption based on a sliding window (SCESW). Compared to existing security strategies, our proposed strategy guarantees security while significantly reducing the computational overhead, and our GRC algorithm can be successfully deployed on a hardware platform.
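For readers unfamiliar with the Feistel architecture mentioned above, the sketch below shows its defining property: encryption and decryption use the same structure with the round keys reversed, and the round function never needs to be invertible. The toy round function is an assumption for illustration; this is not the paper's SCESW scheme.

```python
# Minimal Feistel network sketch. The hash-based round function is a
# toy stand-in; any keyed function works because Feistel decryption
# never inverts it, only re-applies it.
import hashlib

def _f(half: bytes, key: bytes) -> bytes:
    """Toy round function: truncated SHA-256 of key || half."""
    return hashlib.sha256(key + half).digest()[:len(half)]

def _xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def feistel_encrypt(block: bytes, round_keys):
    assert len(block) % 2 == 0
    l, r = block[:len(block) // 2], block[len(block) // 2:]
    for k in round_keys:
        l, r = r, _xor(l, _f(r, k))   # swap halves, mix via round function
    return l + r

def feistel_decrypt(block: bytes, round_keys):
    l, r = block[:len(block) // 2], block[len(block) // 2:]
    for k in reversed(round_keys):
        l, r = _xor(r, _f(l, k)), l   # same structure, keys reversed
    return l + r
```

A lightweight structure like this is attractive on constrained edge hardware precisely because the round function can be cheap and non-invertible.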


Sensors ◽  
2020 ◽  
Vol 20 (4) ◽  
pp. 973
Author(s):  
Tianyi Liu ◽  
Ruyu Luo ◽  
Fangmin Xu ◽  
Chaoqiong Fan ◽  
Chenglin Zhao

With the development of global urbanization, the Internet of Things (IoT) and smart cities are becoming hot research topics. As an emerging model, edge computing can play an important role in smart cities because of its low latency and good performance. IoT devices can reduce time consumption with the help of a mobile edge computing (MEC) server. However, if too many IoT devices simultaneously choose to offload their computation tasks to the MEC server via the limited wireless channel, it may lead to channel congestion, thus increasing the time overhead. Facing the large number of IoT devices in smart cities, a centralized resource allocation algorithm needs a great deal of signaling exchange, resulting in low efficiency. To solve this problem, this paper studies the joint communication and computing policy of IoT devices in edge computing through game theory, and proposes distributed Q-learning algorithms with two learning policies. Simulation results show that the algorithm converges quickly to a balanced solution.
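The core idea can be sketched as independent Q-learners: each device keeps a two-entry cost table (local vs. offload), and the offload cost grows with the number of simultaneous offloaders, which is exactly the congestion effect described above. The cost model and learning parameters are illustrative assumptions, not the paper's algorithms.

```python
# Independent Q-learning devices choosing between local computing
# (action 0) and offloading to the MEC server (action 1). The linear
# congestion cost is an illustrative assumption.
import random

N_DEVICES, LOCAL_COST = 10, 1.0
ALPHA, EPS = 0.1, 0.1  # learning rate, exploration probability

def offload_cost(n_offloading):
    """Offload delay grows with the number of simultaneous offloaders."""
    return 0.2 + 0.15 * n_offloading

def step(q_tables):
    # Epsilon-greedy action per device; q[a] estimates the cost of a,
    # so the greedy choice is the action with the lower estimate.
    actions = [
        random.randrange(2) if random.random() < EPS else int(q[1] < q[0])
        for q in q_tables
    ]
    n_off = sum(actions)
    for q, a in zip(q_tables, actions):
        cost = LOCAL_COST if a == 0 else offload_cost(n_off)
        q[a] += ALPHA * (cost - q[a])  # move estimate toward observed cost
    return actions

def run(episodes=3000, seed=0):
    random.seed(seed)
    q_tables = [[0.0, 0.0] for _ in range(N_DEVICES)]
    for _ in range(episodes):
        step(q_tables)
    return q_tables
```

No central coordinator is needed: each device only observes its own cost, yet the population settles near the point where offloading and computing locally cost about the same, which is the balanced (equilibrium) solution the abstract refers to.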


Considering the large number of devices connected to the Internet of Things (IoT), identifying malicious devices for the purpose of “search & seizure” remains a critical issue for digital investigators. Consequently, techniques that automatically identify malicious devices can speed up the process of digital investigation. However, few conceptual approaches have been proposed to identify malicious devices during an IoT forensic investigation. To overcome this, a formal approach is proposed to automatically triage and fingerprint malicious IoT devices together with their respective states. It is expected that with the proposed formal approach, investigators can readily identify malicious devices and their states, as well as determine the scope of the investigation.
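A minimal sketch of automated triage might map a few observed indicators per device to a state label and rank the inventory worst-first to scope the investigation. The indicators, thresholds, and states below are assumptions for illustration; they do not reproduce the paper's formal model.

```python
# Illustrative rule-based triage of IoT devices for a forensic
# investigation. Indicators and thresholds are assumed examples.

def triage(device):
    """Return (state, reasons) for one device's observed indicators."""
    reasons = []
    if device.get("known_c2_contact"):
        reasons.append("contacted known C2 host")
    if device.get("firmware_hash_mismatch"):
        reasons.append("firmware hash does not match vendor image")
    if device.get("outbound_mb_per_h", 0) > 100:
        reasons.append("abnormal outbound traffic volume")
    if device.get("known_c2_contact") or len(reasons) >= 2:
        return "malicious", reasons
    if reasons:
        return "suspicious", reasons
    return "benign", reasons

def fingerprint_inventory(devices):
    """Triage a whole inventory, worst states first, to scope the search."""
    order = {"malicious": 0, "suspicious": 1, "benign": 2}
    return sorted(
        ((triage(d)[0], d["id"]) for d in devices),
        key=lambda t: order[t[0]],
    )
```

Keeping the reasons alongside each state matters in a forensic setting, since every classification may need to be justified later.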


2021 ◽  
pp. 39-45
Author(s):  
Yabin Wang ◽  
◽  
Jing Yu

The emergence of edge computing makes up for the limited capacity of IoT devices: by migrating intensive computing tasks from devices to edge nodes (ENs), we can save energy while still maintaining the quality of service. The computation offloading decision involves collaboration and complex resource management, and it should be made in real time according to the dynamic workload and network environment. A simulation-based method is used to maximize long-term utility by deploying deep reinforcement learning agents on IoT devices and edge nodes, and federated learning is introduced to distribute the training of these agents. First, an IoT system supporting edge computing is built; a device downloads the existing model from the edge node for training and offloads intensive computing tasks to the edge node. The device then uploads its updated parameters to the edge node, which aggregates them with its current model to obtain a new model. The cloud, in turn, aggregates the new models from the edge nodes, and can also pass updated parameters from the edge nodes back to the devices.
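The aggregation step in that workflow can be sketched as federated averaging: devices take local training steps, then the edge node averages the uploaded parameters, weighted by local data size. Parameters are plain weight vectors here, and the gradient values are illustrative; the deep reinforcement learning details are omitted.

```python
# Minimal federated-averaging sketch for the device/edge workflow:
# devices train locally, the edge node aggregates the uploads.

def local_update(weights, gradient, lr=0.1):
    """One local training step on a device (gradient is illustrative)."""
    return [w - lr * g for w, g in zip(weights, gradient)]

def fed_avg(updates, sizes):
    """Edge node: average device models, weighted by local data size."""
    total = sum(sizes)
    dim = len(updates[0])
    return [
        sum(u[i] * s for u, s in zip(updates, sizes)) / total
        for i in range(dim)
    ]

if __name__ == "__main__":
    global_w = [0.0, 0.0]
    dev_a = local_update(global_w, gradient=[1.0, -1.0])
    dev_b = local_update(global_w, gradient=[-1.0, 1.0])
    new_global = fed_avg([dev_a, dev_b], sizes=[3, 1])
    print(new_global)
```

Only parameters cross the network, never raw training data, which is what makes this scheme attractive when device data is sensitive or bandwidth is limited.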


Impact ◽  
2019 ◽  
Vol 2019 (10) ◽  
pp. 61-63 ◽  
Author(s):  
Akihiro Fujii

The Internet of Things (IoT) is a term that describes a system of interrelated computing devices, digital machines, objects, animals or people. Each of the interrelated 'things' is given a unique identifier and the ability to transfer data over a network without requiring human-to-human or human-to-computer interaction. Examples of IoT in practice include a human with a heart monitor implant, an animal with a biochip transponder (an electronic device inserted under the skin that gives the animal a unique identification number) and a car with built-in sensors that can alert the driver about problems, such as low tyre pressure. The concept of a network of devices was established as early as 1982, although the term 'Internet of Things' was almost certainly first coined by Kevin Ashton in 1999. Since then, IoT devices have become ubiquitous, certainly in some parts of the world. Although there have been significant developments in the technology associated with IoT, the concept is far from being fully realised. Indeed, the potential reach of IoT extends to areas which some would find surprising. Researchers at the Faculty of Science and Engineering, Hosei University in Japan, are exploring the use of IoT in the agricultural sector, with some specific work on the production of melons. For the advancement of IoT in agriculture, a difficult and important issue is implementing subtle, experience-based farming activities as computational procedures, and the researchers' work on these challenges is ongoing.


Author(s):  
Jaber Almutairi ◽  
Mohammad Aldossary

Recently, the number of Internet of Things (IoT) devices connected to the Internet has increased dramatically, as has the data produced by these devices. This requires offloading IoT tasks to release heavy computation and storage to resource-rich nodes such as Edge Computing and Cloud Computing. Although Edge Computing is a promising enabler for latency-sensitive issues, its deployment produces new challenges. Besides, different service architectures and offloading strategies have a different impact on the service time performance of IoT applications. Therefore, this paper presents a novel approach for task offloading in an Edge-Cloud system in order to minimize the overall service time for latency-sensitive applications. The approach adopts fuzzy logic algorithms, considering application characteristics (e.g., CPU demand, network demand and delay sensitivity) as well as resource utilization and resource heterogeneity. A number of simulation experiments are conducted to evaluate the proposed approach against other related approaches; it was found to improve the overall service time for latency-sensitive applications and to utilize edge-cloud resources effectively. The results also show that different offloading decisions within the Edge-Cloud system can lead to varying service times due to the computational resources and communication types involved.
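A fuzzy offloading decision of this kind can be sketched as follows: the three application characteristics are fuzzified with triangular membership functions, a few rules vote for the edge or the cloud, and the larger aggregate wins. The membership shapes and rules below are assumptions for illustration, not the paper's actual rule base.

```python
# Illustrative fuzzy-logic offloading decision over CPU demand,
# network demand, and delay sensitivity (all on a 0..1 scale).
# Membership functions and rules are assumed examples.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def high(x):
    return tri(x, 0.4, 1.0, 1.6)

def low(x):
    return tri(x, -0.6, 0.0, 0.6)

def offload_target(cpu_demand, net_demand, delay_sensitivity):
    # Rule 1: delay-sensitive or light tasks go to the nearby edge.
    edge = max(high(delay_sensitivity), low(cpu_demand))
    # Rule 2: heavy, delay-tolerant tasks go to the cloud...
    cloud = min(high(cpu_demand), low(delay_sensitivity))
    # Rule 3: ...unless network demand makes the cloud path expensive.
    cloud = min(cloud, low(net_demand))
    return "edge" if edge >= cloud else "cloud"
```

For example, a small, delay-sensitive task lands on the edge, while a heavy, delay-tolerant one is sent to the cloud; the graded memberships let borderline tasks be decided by degree rather than by hard thresholds.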

