Social Network Spam Detection Based on ALBERT and Combination of Bi-LSTM with Self-Attention

2021 ◽  
Vol 2021 ◽  
pp. 1-11
Author(s):  
Guangxia Xu ◽  
Daiqi Zhou ◽  
Jun Liu

Social networks are full of spam and spammers. Although social network platforms have established a variety of strategies to prevent the spread of spam, strict information review mechanisms have given rise to smarter spammers who disguise spam as text sent by ordinary users. In response, this paper proposes a spam detection method based on a self-attention Bi-LSTM neural network combined with ALBERT, a lightweight variant of the BERT word vector model. We use ALBERT to transform social network text into word vectors, which are then fed into the Bi-LSTM layer. After feature extraction, the self-attention layer weights the informative positions to produce the final feature vector, and a softmax classifier outputs the classification result. We evaluate the model on accuracy, precision, F1-score, and related metrics; the results show that it outperforms comparison models.
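The pipeline the abstract describes (pretrained embeddings → Bi-LSTM → self-attention pooling → softmax classifier) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the layer sizes, the additive attention form, and the assumption that ALBERT token embeddings arrive as a `(batch, seq_len, embed_dim)` tensor are all hypothetical choices for the sketch.

```python
import torch
import torch.nn as nn

class BiLSTMSelfAttention(nn.Module):
    """Bi-LSTM encoder with additive self-attention pooling and a softmax head.
    Hyperparameters are illustrative, not taken from the paper."""

    def __init__(self, embed_dim=128, hidden_dim=64, num_classes=2):
        super().__init__()
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)   # per-position attention score
        self.fc = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, x):
        # x: (batch, seq_len, embed_dim), e.g. ALBERT token embeddings
        h, _ = self.bilstm(x)                          # (batch, seq_len, 2*hidden)
        weights = torch.softmax(self.attn(h), dim=1)   # (batch, seq_len, 1)
        context = (weights * h).sum(dim=1)             # attention-weighted pooling
        return self.fc(context)                        # logits for softmax classifier

model = BiLSTMSelfAttention()
logits = model(torch.randn(4, 20, 128))  # 4 texts, 20 tokens each
```

In practice the embeddings would come from a pretrained ALBERT encoder (e.g. via the Hugging Face `transformers` library) rather than random tensors, and the logits would be trained with cross-entropy against spam/ham labels.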

2020 ◽  
Vol 65 (1) ◽  
pp. 355-367
Author(s):  
Ye Wang ◽  
Bixin Liu ◽  
Hongjia Wu ◽  
Shan Zhao ◽  
Zhiping Cai ◽  
...  

2020 ◽  
Vol 2020 ◽  
pp. 1-9
Author(s):  
Jun Zhao ◽  
Xumei Chen

An intelligent evaluation method is presented to analyze the competitiveness of airlines. From the perspectives of safety, service, and normality, we establish competitiveness indexes for traffic rights and a standard sample base. A self-organizing map (SOM) neural network is used to self-organize and learn from the samples without supervision or prior knowledge, and the number of training steps is determined through a multistep setting to achieve fast convergence and high clustering accuracy. Index data from typical airlines are used to verify the effect of the SOM network on airline competitiveness analysis. The simulation results show that the SOM network can accurately and effectively classify and evaluate the competitiveness of airlines, and the results provide an important reference for the allocation of traffic rights resources.
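A SOM of the kind the abstract relies on can be written in a few lines: each grid node holds a weight vector, and at every step the best-matching unit (BMU) and its grid neighbours move toward a training sample. The grid size, learning-rate schedule, and neighbourhood function below are generic textbook choices, not the paper's settings.

```python
import numpy as np

def train_som(data, grid=(3, 3), steps=200, lr0=0.5, sigma0=1.0, seed=0):
    """Minimal self-organizing map. Returns the trained weight grid
    of shape (rows, cols, dim). Schedules are illustrative."""
    rng = np.random.default_rng(seed)
    rows, cols = grid
    dim = data.shape[1]
    w = rng.random((rows, cols, dim))
    # Grid coordinates of every node, shape (rows, cols, 2)
    coords = np.dstack(np.meshgrid(np.arange(rows), np.arange(cols),
                                   indexing="ij")).astype(float)
    for t in range(steps):
        x = data[rng.integers(len(data))]
        # BMU: node whose weight vector is closest to the sample
        d = np.linalg.norm(w - x, axis=2)
        bmu = np.unravel_index(d.argmin(), d.shape)
        # Linearly decaying learning rate and neighbourhood radius
        lr = lr0 * (1 - t / steps)
        sigma = sigma0 * (1 - t / steps) + 1e-3
        # Gaussian neighbourhood on the grid around the BMU
        grid_dist = np.linalg.norm(coords - np.array(bmu, dtype=float), axis=2)
        h = np.exp(-grid_dist**2 / (2 * sigma**2))[..., None]
        w += lr * h * (x - w)
    return w

def bmu_of(w, x):
    """Map a sample to its best-matching grid node."""
    d = np.linalg.norm(w - x, axis=2)
    return np.unravel_index(d.argmin(), d.shape)

# Demo: two well-separated clusters in 2-D stand in for airline index vectors
rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0.0, 0.05, (20, 2)),
                  rng.normal(1.0, 0.05, (20, 2))])
weights = train_som(data)
```

After training, `bmu_of` assigns each airline's index vector to a grid cell, and cells group airlines of similar competitiveness, which is the clustering behaviour the abstract exploits.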


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Muhammad Aqeel Aslam ◽  
Cuili Xue ◽  
Yunsheng Chen ◽  
Amin Zhang ◽  
Manhua Liu ◽  
...  

Deep learning is an emerging tool that is increasingly used for disease diagnosis in the medical field, and a new research direction has developed around the detection of early-stage gastric cancer, where computer-aided diagnosis (CAD) systems help reduce mortality. In this study, we propose a feature-extraction method that uses a stacked sparse autoencoder to learn discriminative features from unlabeled breath-sample data; a softmax classifier is then integrated into the network to classify gastric cancer from the breath samples. Specifically, we identified fifty peaks in each spectrum to distinguish early gastric cancer (EGC), advanced gastric cancer (AGC), and healthy persons. The CAD system reduces the distance between input and output by learning features that preserve the structure of the breath-sample data set. After unsupervised training was complete, the autoencoders and the softmax classifier were cascaded into a deep stacked sparse autoencoder neural network, which was finally fine-tuned with labeled training data to make the model more reliable and repeatable. The proposed architecture exhibits excellent results, with an overall accuracy of 98.7% for advanced gastric cancer classification and 97.3% for early gastric cancer detection using breath analysis. Moreover, the model achieves excellent recall, precision, and F-score values, making it suitable for clinical application.
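One building block of the pipeline above, a sparse autoencoder with a KL-divergence sparsity penalty, can be sketched as follows. The layer sizes, target sparsity `rho`, and penalty weight `beta` are hypothetical; the abstract does not give the paper's architecture, only that such layers are pretrained unsupervised, stacked, topped with a softmax classifier, and fine-tuned.

```python
import torch
import torch.nn as nn

class SparseAutoencoder(nn.Module):
    """One layer of a stacked sparse autoencoder (illustrative sizes)."""

    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.Sigmoid())
        self.decoder = nn.Linear(hid_dim, in_dim)

    def forward(self, x):
        z = self.encoder(x)          # hidden code, used as input to the next layer
        return self.decoder(z), z    # reconstruction and code

def sparse_loss(x, recon, z, rho=0.05, beta=1.0):
    """Reconstruction error plus a KL sparsity penalty that pushes the
    mean activation of each hidden unit toward the target rho."""
    mse = nn.functional.mse_loss(recon, x)
    rho_hat = z.mean(dim=0).clamp(1e-6, 1 - 1e-6)
    kl = (rho * torch.log(rho / rho_hat)
          + (1 - rho) * torch.log((1 - rho) / (1 - rho_hat))).sum()
    return mse + beta * kl

# One forward pass: 8 breath samples, each with 50 spectral-peak features
x = torch.rand(8, 50)
ae1 = SparseAutoencoder(50, 20)
recon, z = ae1(x)
loss = sparse_loss(x, recon, z)
```

Stacking then proceeds greedily: train `ae1` on the raw features, train a second autoencoder on the codes `z`, append a softmax layer on the final codes, and fine-tune the whole cascade with the labeled EGC/AGC/healthy data, mirroring the procedure the abstract describes.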


2020 ◽  
Vol 53 (2) ◽  
pp. 15374-15379
Author(s):  
Hu He ◽  
Xiaoyong Zhang ◽  
Fu Jiang ◽  
Chenglong Wang ◽  
Yingze Yang ◽  
...  
