HYBRID DEEP NEURAL NETWORK ANOMALY DETECTION SYSTEM FOR SCADA NETWORKS

2021 ◽  
Vol 128 (2) ◽  
pp. 141-191
Author(s):  
Raogo Kabore ◽  
Hyacinthe Kouassi Konan ◽  
Adlès Kouassi ◽  
Yvon Kermarrec ◽  
Philippe Lenca ◽  
...  
2021 ◽  
Vol 11 (15) ◽  
pp. 7050
Author(s):  
Zeeshan Ahmad ◽  
Adnan Shahid Khan ◽  
Kashif Nisar ◽  
Iram Haider ◽  
Rosilah Hassan ◽  
...  

The revolutionary idea of the internet of things (IoT) architecture has gained enormous popularity over the last decade, resulting in exponential growth in IoT networks, connected devices, and the data processed therein. Since IoT devices generate and exchange sensitive data over the traditional internet, security has become a prime concern due to the emergence of zero-day cyberattacks. A network-based intrusion detection system (NIDS) can provide the much-needed efficient security solution for an IoT network by protecting its entry points through constant network traffic monitoring. Recent NIDSs have a high false alarm rate (FAR) in detecting anomalies, including novel and zero-day anomalies. This paper proposes an efficient anomaly detection mechanism using mutual information (MI), considering a deep neural network (DNN) for an IoT network. A comparative analysis of different deep-learning models, such as the DNN, Convolutional Neural Network, Recurrent Neural Network, and its variants Gated Recurrent Unit and Long Short-Term Memory, is performed on the IoT-Botnet 2020 dataset. Experimental results show an improvement of 0.57–2.6% in model accuracy while reducing the FAR by 0.23–7.98%, demonstrating the effectiveness of the DNN-based NIDS model compared to the well-known deep learning models. It was also observed that using only the 16–35 best numerical features selected by MI, instead of all 80 features of the dataset, results in almost negligible degradation of the model's performance while decreasing the overall model complexity. In addition, the overall detection accuracy of the DL-based models is further improved by almost 0.99–3.45% when considering only the top five categorical and numerical features.
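The MI-based feature-selection step described above can be sketched as follows. This is not the authors' code: the dataset is a toy stand-in for IoT-Botnet 2020, the feature count is reduced from the paper's 16–35, and the MI estimator is a minimal empirical one for discrete features.

```python
# Sketch of mutual-information feature ranking for a NIDS pipeline.
# Toy data and feature names are illustrative, not from the paper.
import math
from collections import Counter

import numpy as np

def mutual_information(x, y):
    """Empirical MI (in nats) between a discrete feature x and labels y."""
    n = len(x)
    px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
    mi = 0.0
    for (xv, yv), c in pxy.items():
        # p(x,y) * log( p(x,y) / (p(x) p(y)) ) = (c/n) * log(c*n / (px*py))
        mi += (c / n) * math.log(c * n / (px[xv] * py[yv]))
    return mi

rng = np.random.default_rng(0)
y = rng.integers(0, 2, 1000)            # 0 = benign flow, 1 = attack flow
f0 = y ^ (rng.random(1000) < 0.05)      # feature correlated with the label
f1 = rng.integers(0, 2, 1000)           # pure-noise feature
scores = [mutual_information(f0, y), mutual_information(f1, y)]

# Rank features by MI and keep only the top-k before training the DNN.
ranked = sorted(range(len(scores)), key=lambda i: -scores[i])
print(ranked)  # the informative feature ranks first
```

On real traffic data, continuous features would first be discretized (or a k-NN MI estimator used), and the top-k cutoff tuned against the accuracy/complexity trade-off the abstract reports.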


2014 ◽  
Vol 2014 ◽  
pp. 1-13 ◽  
Author(s):  
Yuan Liu ◽  
Xiaofeng Wang ◽  
Kaiyu Liu

Network anomaly detection has attracted increasing attention with the rapid development of computer networks. Some researchers have applied fusion methods and Dempster-Shafer (DS) evidence theory to network anomaly detection, but with low performance, and without considering that network features are complicated and varied. To achieve a high detection rate, we present a novel network anomaly detection system with optimized Dempster-Shafer evidence theory (ODS) and a regression basic probability assignment (RBPA) function. In this model, we add a weight for each sensor, according to its previous prediction accuracy, to optimize DS evidence theory, and RBPA employs each sensor's regression ability to address complex networks. Through four kinds of experiments, we find that our network anomaly detection model achieves a better detection rate, and that both the RBPA and ODS optimizations improve system performance significantly.
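The weighted-evidence idea can be illustrated with Dempster's rule of combination over a two-element frame {normal, anomaly}. This is a minimal sketch, not the paper's ODS/RBPA implementation: the sensor weights, masses, and the use of classical Shafer discounting to apply the per-sensor weight are all illustrative assumptions.

```python
# Sketch: weight each sensor's basic probability assignment (BPA) by its
# past accuracy (via Shafer discounting), then fuse with Dempster's rule.
N = frozenset({"normal"})
A = frozenset({"anomaly"})
TH = frozenset({"normal", "anomaly"})  # full frame = total ignorance

def discount(bpa, w):
    """Scale each focal mass by weight w; move the remainder to the frame."""
    out = {k: w * v for k, v in bpa.items() if k != TH}
    out[TH] = 1.0 - sum(out.values())
    return out

def combine(m1, m2):
    """Dempster's rule: normalized conjunctive combination of two BPAs."""
    fused, conflict = {}, 0.0
    for a, va in m1.items():
        for b, vb in m2.items():
            inter = a & b
            if inter:
                fused[inter] = fused.get(inter, 0.0) + va * vb
            else:
                conflict += va * vb
    return {k: v / (1.0 - conflict) for k, v in fused.items()}

# Sensor 1 is historically accurate (w=0.9) and votes "anomaly";
# sensor 2 is less reliable (w=0.5) and weakly favors "normal".
s1 = discount({A: 0.8, N: 0.1, TH: 0.1}, w=0.9)
s2 = discount({A: 0.3, N: 0.5, TH: 0.2}, w=0.5)
fused = combine(s1, s2)
# The more reliable sensor dominates: mass on "anomaly" exceeds "normal".
```

Weighting by past accuracy means an unreliable sensor's evidence is largely shifted to ignorance before fusion, so it cannot drag down the combined verdict.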


2019 ◽  
Vol 8 (1) ◽  
pp. 46-51 ◽  
Author(s):  
Mukrimah Nawir ◽  
Amiza Amir ◽  
Naimah Yaakob ◽  
Ong Bi Lynn

A network anomaly detection system monitors a computer network for behavior that deviates from the network protocol, and it is implemented in many domains. However, a problem arises because different application domains define anomalies differently in their environments. This makes choosing the best algorithm to suit and fulfill the requirements of a given domain difficult and not straightforward. Additionally, there is the issue of centralization, which can cause fatal destruction of the network system when powerful malicious code is injected into it. Therefore, in this paper we conduct experiments using supervised machine learning (ML) for a network anomaly detection system with low communication cost and minimized network bandwidth, using the UNSW-NB15 dataset to compare classifier performance in terms of accuracy (effectiveness) and the processing time needed to build a model (efficiency). Supervised machine learning takes the important features into account by labelling them in the dataset. The best machine learning algorithm for this network dataset is AODE, with an accuracy of 97.26% and a processing time of approximately 7 seconds. A distributed algorithm also solves the centralization issue, with accuracy and processing time that remain comparable to those of a centralized algorithm, despite a small drop in accuracy and a slightly longer runtime.
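The accuracy-versus-training-time comparison described above can be sketched as a simple benchmark loop. This is an illustration, not the paper's experiment: AODE is typically run in Weka, so Gaussian naive Bayes and a decision tree stand in here, and synthetic data replaces UNSW-NB15.

```python
# Sketch: evaluate classifiers on both effectiveness (test accuracy)
# and efficiency (wall-clock time to build the model).
import time

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the UNSW-NB15 feature matrix and labels.
X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=42)

results = {}
for name, clf in [("NaiveBayes", GaussianNB()),
                  ("DecisionTree", DecisionTreeClassifier(random_state=42))]:
    t0 = time.perf_counter()
    clf.fit(Xtr, ytr)                          # model-building time (efficiency)
    elapsed = time.perf_counter() - t0
    acc = clf.score(Xte, yte)                  # test accuracy (effectiveness)
    results[name] = (acc, elapsed)

for name, (acc, t) in results.items():
    print(f"{name}: accuracy={acc:.3f}, train_time={t:.4f}s")
```

Ranking candidates on this (accuracy, time) pair is exactly the trade-off the abstract uses to single out AODE as the best fit for the dataset.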

