Improved Mitigation of Cyber Threats in IIoT for Smart Cities: A New-Era Approach and Scheme

Sensors ◽  
2021 ◽  
Vol 21 (6) ◽  
pp. 1976
Author(s):  
Semi Park ◽  
Kyungho Lee

Cybersecurity in the Industrial Internet of Things (IIoT) has become critical as smart cities are increasingly linked to the industrial control systems (ICSs) used in critical infrastructure. Consequently, data-driven security systems for analyzing the massive amounts of data generated by smart cities have become essential. A representative method for analyzing large-scale data is the game-bot detection approach used in massively multiplayer online role-playing games. We reviewed the literature on bot detection methods in order to extend the anomaly detection approaches used in bot detection schemes to IIoT fields. Finally, we proposed a process in which a data envelopment analysis (DEA) model is applied to identify features for efficiently detecting anomalous behavior in smart cities. Experimental results using a random forest show that our features, adapted from game-bot detection, achieve an average F1-score of 0.99903 under 10-fold cross-validation. We confirmed the applicability of the game-industry methodology to other fields and trained a random forest on the high-efficiency features identified by the DEA, obtaining an F1-score of 0.997 using the validation-set approach. In this study, an anomaly detection method for analyzing massive smart-city data, based on a game-industry methodology, was presented and applied to an ICS dataset.
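The evaluation protocol described above (a random forest scored by F1 under 10-fold cross-validation) can be sketched as follows. The synthetic data and every parameter value are illustrative stand-ins, since the paper's DEA-selected features and ICS dataset are not reproduced here:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for a labeled ICS feature matrix: mostly normal
# behavior (class 0) with a minority of anomalous samples (class 1).
X, y = make_classification(n_samples=500, n_features=10,
                           weights=[0.9, 0.1], random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)

# 10-fold cross-validation, scored by F1 as in the abstract
scores = cross_val_score(clf, X, y, cv=10, scoring="f1")
print(scores.mean())
```

The same protocol applies unchanged once the feature matrix is replaced by DEA-selected features from real data.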

Author(s):  
Zhijun Zhao ◽  
Chen Xu ◽  
Bo Li

Security devices produce a huge number of logs, far beyond the processing capacity of human analysts. This paper introduces an unsupervised approach to detecting anomalous behavior in large-scale security logs. We propose a novel feature-extraction mechanism that precisely characterizes malicious behavior. We design an LSTM-based anomaly detection approach that successfully identifies attacks on two widely used datasets. Our approach outperforms three popular anomaly detection algorithms, one-class SVM, GMM, and principal component analysis, in terms of accuracy and efficiency.
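Each of the three baselines named above can be turned into an anomaly scorer in a few lines. A minimal scikit-learn sketch on synthetic log-feature vectors follows; all shapes, thresholds, and hyperparameters are illustrative assumptions, not the paper's setup:

```python
import numpy as np
from sklearn.svm import OneClassSVM
from sklearn.mixture import GaussianMixture
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
normal = rng.normal(0.0, 1.0, size=(500, 8))  # features of benign log windows
attack = rng.normal(4.0, 1.0, size=(20, 8))   # attacks shift the distribution

# One-class SVM: fit on normal traffic only; prediction -1 marks an outlier
svm_flags = OneClassSVM(nu=0.05).fit(normal).predict(attack) == -1

# GMM: a low log-likelihood under the normal model marks an anomaly
gmm = GaussianMixture(n_components=2, random_state=0).fit(normal)
gmm_flags = gmm.score_samples(attack) < np.percentile(gmm.score_samples(normal), 1)

# PCA: a large reconstruction error marks an anomaly
pca = PCA(n_components=3).fit(normal)
def recon_error(x):
    return np.linalg.norm(x - pca.inverse_transform(pca.transform(x)), axis=1)
pca_flags = recon_error(attack) > np.percentile(recon_error(normal), 99)
```

An LSTM-based detector differs from these in that it scores *sequences* of log events rather than independent feature vectors, which is what lets it capture temporal attack patterns.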


2019 ◽  
Vol 11 (21) ◽  
pp. 2537 ◽  
Author(s):  
Dandan Ma ◽  
Yuan Yuan ◽  
Qi Wang

A hyperspectral image usually covers a large ground scene containing various materials with different spectral properties. When the background is estimated directly from all image pixels, complex spectral interactions and the inter-/intra-class differences of samples significantly reduce the accuracy of background estimation and, in turn, detection performance. To address this problem, this paper proposes a novel hyperspectral anomaly detection method based on a separability-aware sample cascade model. By measuring the separability of hyperspectral pixels, background samples are sifted out layer by layer according to how separable they are from anomalies, which ensures the accuracy and distinctiveness of the background representation. First, since spatial structure is beneficial for recognizing targets, a new spectral-spatial feature extraction technique based on PCA and edge-preserving filtering is used. Second, depending on the separability computed by sparse representation, samples are divided into sets that effectively and completely reflect the varied characteristics of the background across all cascade layers; meanwhile, potential anomalous targets are removed at each selection step to avoid affecting subsequent layers. Finally, taking the complementary strengths of all separability-aware layers into account, a simple multilayer anomaly detection strategy produces the final detection map. Extensive experiments on five real-world hyperspectral images demonstrate the method's superior performance: compared with seven representative anomaly detection methods, it improves average detection accuracy by a clear margin.
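A heavily simplified sketch of the first ingredient, PCA-based spectral feature extraction followed by anomaly scoring, is shown below. The Mahalanobis (RX-style) score is a textbook stand-in for the paper's separability-aware cascade, and the tiny synthetic cube is an assumption; edge-preserving filtering and the sparse-representation layers are omitted:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
# Toy "hyperspectral" cube: 32x32 pixels, 50 spectral bands,
# with one anomalous pixel whose spectrum is shifted in every band.
cube = rng.normal(0.0, 1.0, size=(32, 32, 50))
cube[5, 7] += 6.0
pixels = cube.reshape(-1, 50)

# Spectral feature extraction: project each pixel onto 5 principal components
feats = PCA(n_components=5).fit_transform(pixels)

# RX-style scoring: squared Mahalanobis distance to the global background
mu = feats.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(feats, rowvar=False))
centered = feats - mu
d = np.einsum("ij,jk,ik->i", centered, cov_inv, centered)
row, col = np.unravel_index(d.argmax(), (32, 32))
```

The cascade model in the paper refines exactly this step: instead of one global background statistic, background pixels are re-selected layer by layer so that anomalous pixels stop contaminating `mu` and the covariance.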


2017 ◽  
Vol 65 (4) ◽  
Author(s):  
Marco F. Huber

Detecting anomalous behavior of technical systems early enough facilitates cost savings by avoiding system downtime, guiding maintenance, or improving performance. The novel framework proposed in this paper processes event streams originating from system monitoring for anomaly detection purposes. Statistical models characterizing the normal behavior of the monitored system are learned from the events. Instead of one coarse normal model for all operational states, the framework contains a mechanism for automatically detecting the system's different conditions, allowing a fine-tuned model for every condition. The performance of the framework is demonstrated on a real-world application, in which the log files of a large-scale printing machine are analyzed for anomalies.
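The idea of per-condition normal models can be sketched as follows: discover the operating conditions automatically (here with k-means, an illustrative choice), then fit a separate Gaussian normal model for each. The two synthetic "conditions" and the 4-sigma threshold are assumptions, not the paper's models:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Two operating conditions (e.g. idle vs. full production) whose
# "normal" sensor readings live in very different ranges.
idle = rng.normal(10.0, 1.0, size=(300, 2))
busy = rng.normal(50.0, 5.0, size=(300, 2))
events = np.vstack([idle, busy])

# Automatically detect the conditions, then fit one normal model per condition
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(events)
models = {c: (events[km.labels_ == c].mean(axis=0),
              events[km.labels_ == c].std(axis=0))
          for c in range(2)}

def is_anomalous(x, k=4.0):
    c = km.predict(x.reshape(1, -1))[0]       # which condition are we in?
    mu, sd = models[c]
    return bool(np.any(np.abs(x - mu) > k * sd))  # fine-tuned per-condition test
```

A single global model would have to accept the whole range spanned by both conditions, masking deviations that are obvious within one condition.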


2021 ◽  
Vol 3 (1) ◽  
Author(s):  
Ibrahim Muzaferija ◽  
Zerina Mašetić

While leveraging cloud computing for large-scale distributed applications allows seamless scaling, many companies struggle to keep up with the amount of data generated, in terms of both efficient processing and anomaly detection, which is a necessary part of managing modern applications. As records of user behavior, web logs have naturally become a subject of anomaly-detection research, and many anomaly detection methods based on automated log analysis have been proposed. However, few address the context of big-data applications, where anomalous behavior needs to be detected in the understanding phases, prior to modeling a system for such use. Big Data Analytics often ignores anomalous points because of the high volume of data. To address this problem, we propose a complementary methodology for Big Data Analytics: Exploratory Data Analysis, which assists in gaining insight into data relationships without classical hypothesis modeling. In this way we gain a better understanding of the patterns and can spot anomalies. Results show that Exploratory Data Analysis facilitates both anomaly detection and the CRISP-DM Business Understanding phase, making it one of the key steps in the Data Understanding phase.
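A classic Exploratory Data Analysis device for spotting anomalies without any hypothesis model is Tukey's interquartile-range fence. The sketch below applies it to a synthetic web-log column; the gamma-distributed response times and the injected outliers are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy web-log response-time column (seconds), plus two anomalous requests
times = np.concatenate([rng.gamma(2.0, 0.1, 1000), [9.0, 12.5]])

# Tukey's fence: points beyond Q3 + 1.5*IQR are flagged during EDA
q1, q3 = np.percentile(times, [25, 75])
iqr = q3 - q1
upper = q3 + 1.5 * iqr
outliers = times[times > upper]
```

In a CRISP-DM workflow this check belongs to the Data Understanding phase: the flagged rows are inspected before any model is built, rather than being silently averaged away by downstream analytics.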


2021 ◽  
Vol 2021 ◽  
pp. 1-6
Author(s):  
Yihan Bian ◽  
Xinchen Tang

With the rapid growth of video surveillance data, there is an increasing demand for automatic anomaly detection in large-scale video data. Detection methods based on the reconstruction errors of deep autoencoders have been widely discussed. However, an autoencoder can sometimes reconstruct an anomaly well, leading to missed detections. To solve this problem, this paper uses a memory module to enhance the autoencoder, called the memory-augmented autoencoder (Memory AE). Given an input, Memory AE first obtains a code from the encoder and then uses it as a query to retrieve the most relevant memory items for reconstruction. In the training phase, the memory content is updated and encouraged to represent prototypical elements of normal data. In the test phase, the learned memory elements are fixed, and the reconstruction is obtained from a few selected memory records of normal data, so reconstructions tend to be close to normal samples and the reconstruction error for anomalies is amplified, strengthening anomaly detection. Experimental results on two public video anomaly detection datasets, the Avenue and ShanghaiTech datasets, demonstrate the effectiveness of the proposed method.
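The core retrieval step can be sketched in numpy: the encoder's code queries the memory by cosine similarity, and a sharpened softmax over the similarities mixes the memory items into the vector passed to the decoder. The memory size, code dimension, and temperature below are illustrative assumptions, not the paper's values:

```python
import numpy as np

rng = np.random.default_rng(0)
memory = rng.normal(size=(10, 16))  # 10 memory items, prototypes of normal codes

def retrieve(z, temperature=0.1):
    # attention weights from cosine similarity between the query code and memory
    sims = memory @ z / (np.linalg.norm(memory, axis=1) * np.linalg.norm(z) + 1e-8)
    w = np.exp(sims / temperature)
    w /= w.sum()
    return w @ memory  # decoder input: a combination of a few normal prototypes

# A near-normal code is retrieved almost exactly; an anomalous code is not
z_normal = memory[3] + 0.01 * rng.normal(size=16)
z_anom = 3.0 * rng.normal(size=16)
err_normal = np.linalg.norm(retrieve(z_normal) - z_normal)
err_anom = np.linalg.norm(retrieve(z_anom) - z_anom)
```

Because the decoder only ever sees combinations of normal prototypes, an anomalous input cannot be faithfully reconstructed, which is exactly the amplification of abnormal reconstruction error the abstract describes.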


Author(s):  
Yang Yuan ◽  
Eun Kyung Lee ◽  
Dario Pompili ◽  
Junbi Liao

The high density of servers in datacenters generates a large amount of heat, raising the likelihood of thermally anomalous events such as computer-room air conditioner (CRAC) fan failure, server fan failure, and workload misconfiguration. Because such anomalous events increase the cost of maintaining computing and cooling components, they need to be detected, localized, and classified so that appropriate remedial actions can be taken. In this article, a hierarchical neural network framework is proposed to detect small-scale (server-level) and large-scale (datacenter-level) thermal anomalies. This framework, organized into two tiers, analyzes data sensed by heterogeneous sensors: sensors built into the servers and external (TelosB) sensors. The proposed solution employs a neural network to learn (a) the relationships among sensing values (internal, external, and fan speed) and (b) the relationship between the sensing values and workload information. The bottom tier of the framework detects thermal anomalies, while the top tier localizes and classifies them. As the experimental results show, the solution outperforms anomaly-detection methods based on a regression model, a support vector machine, and a self-organizing map.
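The bottom-tier principle, learning the normal relation between workload and temperature and flagging large residuals, can be sketched as follows. A linear regression stands in for the neural network, and the workload/temperature relation, noise level, and 4-sigma threshold are all illustrative assumptions:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
# Learn the normal workload -> internal-temperature relation from history
workload = rng.uniform(0.0, 1.0, size=(500, 1))
temp = 30.0 + 20.0 * workload[:, 0] + rng.normal(0.0, 0.5, 500)

model = LinearRegression().fit(workload, temp)          # stand-in for the NN
resid_sd = float(np.std(temp - model.predict(workload)))

def detect(load, measured, k=4.0):
    # A temperature far from the prediction signals a thermal anomaly,
    # e.g. a failed fan (too hot for the load) or misreported workload.
    expected = model.predict([[load]])[0]
    return bool(abs(measured - expected) > k * resid_sd)
```

In the full framework this detection output feeds the top tier, which compares residual patterns across servers and sensor types to localize the fault and classify it (CRAC failure vs. server fan failure vs. workload misconfiguration).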


2018 ◽  
Author(s):  
Matthias May ◽  
Kira Rehfeld

Greenhouse gas emissions must be cut to limit global warming to 1.5-2 °C above preindustrial levels. Yet the rate of decarbonization is currently too low to achieve this. Policy-relevant scenarios therefore rely on the permanent removal of CO<sub>2</sub> from the atmosphere. However, none of the envisaged technologies has demonstrated scalability to the decarbonization targets for the year 2050. In this analysis, we show that artificial photosynthesis for CO<sub>2</sub> reduction may deliver an efficient, large-scale carbon sink. This technology has mainly been developed towards solar fuels, and its potential for negative emissions has been largely overlooked. With high efficiency and low sensitivity to high temperature and illumination conditions, it could, if developed into a mature technology, present a viable approach to filling the gap in the negative-emissions budget.


