A practice guide of software aging prediction in a web server based on machine learning

2016 ◽  
Vol 13 (6) ◽  
pp. 225-235 ◽  
Author(s):  
Yongquan Yan ◽  
Ping Guo
PLoS ONE ◽  
2016 ◽  
Vol 11 (8) ◽  
pp. e0155290 ◽  
Author(s):  
Ying Hong Li ◽  
Jing Yu Xu ◽  
Lin Tao ◽  
Xiao Feng Li ◽  
Shuang Li ◽  
...  

2021 ◽  
Vol 13 (6) ◽  
pp. 0-0

Network proxies and Virtual Private Networks (VPNs) are tools used every day to facilitate various business functions. However, they have gained popularity amongst unintended userbases as tools that can be used to mask identities while using websites and web services. Anonymising proxies and VPNs act as an intermediary between a user and a web server, with a proxy or VPN IP address taking the place of the user's IP address that is forwarded to the web server. This paper presents computational models based on intelligent machine learning techniques to address the limitations currently experienced by unauthorised-user detection systems. A model to detect usage of anonymising proxies was developed using a multi-layer perceptron neural network trained on data found in the Transmission Control Protocol (TCP) headers of captured network packets.
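A minimal sketch of this approach, assuming scikit-learn; the TCP-header feature set and the synthetic labels below are illustrative stand-ins, not the paper's actual features, architecture, or data:

```python
# Sketch (not the authors' implementation): training an MLP to flag
# anonymising-proxy/VPN traffic from TCP/IP-header fields.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import classification_report

# Hypothetical per-packet features: source port, destination port,
# window size, TTL, header length, flags bitmap.
rng = np.random.default_rng(0)
X = rng.integers(0, 65536, size=(1000, 6)).astype(float)  # stand-in capture data
y = rng.integers(0, 2, size=1000)                          # 1 = proxy/VPN, 0 = direct

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Header fields span very different ranges, so standardise before the MLP.
scaler = StandardScaler().fit(X_train)
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
clf.fit(scaler.transform(X_train), y_train)

print(classification_report(y_test, clf.predict(scaler.transform(X_test))))
```

In practice the feature vectors would come from parsed packet captures rather than random integers, with labels obtained from traffic known to traverse a proxy or VPN endpoint.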


2021 ◽  
Author(s):  
Ramon Abilio ◽  
Cristiano Garcia ◽  
Victor Fernandes

Browsing the Internet is part of the world population's daily routine. The number of web pages is increasing, and so is the amount of published content (news, tutorials, images, videos) they provide. Search engines use web robots to index web content and offer better results to their users. However, web robots have also been used to exploit vulnerabilities in web pages. Thus, monitoring and detecting web robots' accesses is important in order to keep the web server as safe as possible. Data mining methods have been applied to web server logs (used as the data source) in order to detect web robots. The main objective of this work was to observe evidence of the definition or use of web robot detection by analysing server-side web logs with data mining methods. To that end, we conducted a systematic literature mapping, analysing papers published between 2013 and 2020. The 34 studies analysed in the mapping allowed us to better understand the area of web robot detection: what is being done, the data used to perform detection, and the tools and algorithms used in the literature. From those studies, we extracted 33 machine learning algorithms, 64 features, and 13 tools. This study helps researchers find machine learning algorithms, features, and tools to detect web robots by analysing web server logs.
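As a concrete illustration of the log-based pipeline the mapped studies build on, the sketch below parses Apache combined-format access-log lines and derives a few per-client session features often used as robot indicators; the log format and feature set are assumptions for illustration, not findings extracted from the 34 studies:

```python
# Sketch: turning raw web server logs into session features for robot detection.
import re
from collections import defaultdict

# Apache/Nginx "combined" log format (assumed).
LOG_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) \S+ '
    r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

def session_features(log_lines):
    """Group requests by client IP and compute illustrative robot indicators."""
    sessions = defaultdict(list)
    for line in log_lines:
        m = LOG_RE.match(line)
        if m:
            sessions[m["ip"]].append(m)
    feats = {}
    for ip, reqs in sessions.items():
        n = len(reqs)
        feats[ip] = {
            "requests": n,
            "head_ratio": sum(r["method"] == "HEAD" for r in reqs) / n,
            "err4xx_ratio": sum(r["status"].startswith("4") for r in reqs) / n,
            "hit_robots_txt": any(r["path"] == "/robots.txt" for r in reqs),
            "empty_referer_ratio": sum(r["referer"] in ("", "-") for r in reqs) / n,
        }
    return feats

sample = [
    '66.249.66.1 - - [10/Oct/2020:13:55:36 +0000] "GET /robots.txt HTTP/1.1" 200 310 "-" "Googlebot/2.1"',
    '203.0.113.9 - - [10/Oct/2020:13:55:40 +0000] "GET /index.html HTTP/1.1" 200 5120 "https://example.com" "Mozilla/5.0"',
]
print(session_features(sample))
```

Feature vectors of this kind are what the machine learning algorithms catalogued in the mapping (decision trees, naïve Bayes, SVMs, and so on) are typically trained on.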


2020 ◽  
Vol 12 (1) ◽  
pp. 12 ◽  
Author(s):  
You Guo ◽  
Hector Marco-Gisbert ◽  
Paul Keir

A webshell is a command execution environment in the form of a web page. It is often used by attackers as a backdoor tool for web server operations, so accurately detecting webshells is of great significance to web server protection. Most security products detect webshells with feature-matching methods, matching input scripts against pre-built malicious code collections; such methods have a low detection rate for obfuscated webshells. With the help of machine learning algorithms, however, webshells can be detected more efficiently and accurately. In this paper, we propose a new PHP webshell detection model, the NB-Opcode (naïve Bayes and opcode sequence) model, which combines naïve Bayes classifiers with opcode sequences. Experiments and analysis on a large number of samples show that the proposed method effectively detects a range of webshells. Compared with traditional webshell detection methods, it improves both the efficiency and the accuracy of webshell detection.
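A minimal sketch of the general naïve-Bayes-over-opcodes idea (not the authors' NB-Opcode implementation), assuming opcode sequences have already been dumped from PHP scripts, for example with the VLD extension; the opcode strings and labels below are toy data:

```python
# Sketch: opcode n-grams as features, multinomial naive Bayes as the classifier.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy opcode sequences; real ones come from compiling each PHP file.
benign = [
    "INIT_FCALL SEND_VAL DO_ICALL ECHO RETURN",
    "ASSIGN INIT_FCALL SEND_VAR DO_ICALL RETURN",
]
webshell = [
    "FETCH_R INIT_FCALL SEND_VAR DO_FCALL INCLUDE_OR_EVAL RETURN",
    "CONCAT INIT_FCALL SEND_VAL DO_FCALL INCLUDE_OR_EVAL RETURN",
]
X = benign + webshell
y = [0, 0, 1, 1]  # 1 = webshell

model = make_pipeline(
    # Treat each opcode as a token; use unigrams and bigrams of opcodes.
    CountVectorizer(ngram_range=(1, 2), token_pattern=r"\S+"),
    MultinomialNB(),
)
model.fit(X, y)

print(model.predict(["ASSIGN INIT_FCALL DO_FCALL INCLUDE_OR_EVAL RETURN"]))
```

Working on opcodes rather than source text is what gives this family of methods resilience to obfuscation: renamed variables and encoded payloads still compile down to similar opcode patterns.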

