Semi-Supervised Classification via Hypergraph Convolutional Extreme Learning Machine

2021 ◽  
Vol 11 (9) ◽  
pp. 3867
Author(s):  
Zhewei Liu ◽  
Zijia Zhang ◽  
Yaoming Cai ◽  
Yilin Miao ◽  
Zhikun Chen

Extreme Learning Machine (ELM) is characterized by simplicity, strong generalization ability, and computational efficiency. However, previous ELMs fail to consider the inherent high-order relationships among data points, leaving them ineffective on structured data and poorly robust to noisy data. This paper presents a novel semi-supervised ELM, termed Hypergraph Convolutional ELM (HGCELM), which uses hypergraph convolution to extend ELM into the non-Euclidean domain. The method inherits all the advantages of ELM and consists of a random hypergraph convolutional layer followed by a hypergraph convolutional regression layer, enabling it to model complex intraclass variations. We show that the traditional ELM is a special case of HGCELM in the regular Euclidean domain. Extensive experiments show that HGCELM remarkably outperforms eight competitive methods on 26 classification benchmarks.
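The two-layer design described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: it assumes an HGNN-style symmetric propagation operator G = Dv^(-1/2) H De^(-1) H^T Dv^(-1/2) built from the node-hyperedge incidence matrix, a random (untrained) hidden layer as in classic ELM, and a closed-form ridge-regression output layer; all function and variable names are ours.

```python
import numpy as np

def hypergraph_conv_elm(X, Y_onehot, H, n_hidden=64, reg=1e-2, seed=0):
    """Minimal HGCELM-style sketch (illustrative, not the paper's exact model).

    X: (n, d) node features; Y_onehot: (n, c) labels; H: (n, e) incidence matrix.
    """
    rng = np.random.default_rng(seed)
    # Symmetric hypergraph propagation operator: Dv^{-1/2} H De^{-1} H^T Dv^{-1/2}
    De = np.diag(1.0 / H.sum(axis=0))           # inverse hyperedge degrees
    Dv = np.diag(1.0 / np.sqrt(H.sum(axis=1)))  # inverse sqrt vertex degrees
    G = Dv @ H @ De @ H.T @ Dv

    # Random hypergraph convolutional layer: weights are drawn, never trained
    W = rng.standard_normal((X.shape[1], n_hidden))
    A = np.tanh(G @ X @ W)

    # Hypergraph convolutional regression layer, solved in closed form
    # (the classic ELM ridge solution) after one more propagation step
    Z = G @ A
    beta = np.linalg.solve(Z.T @ Z + reg * np.eye(n_hidden), Z.T @ Y_onehot)
    return Z @ beta  # class scores, shape (n, c)

# Toy usage: 4 nodes, 2 hyperedges (one per class), 2 classes
X = np.array([[0.0, 1.0], [0.2, 0.9], [1.0, 0.0], [0.9, 0.2]])
H = np.array([[1., 0.], [1., 0.], [0., 1.], [0., 1.]])  # node-hyperedge incidence
Y = np.eye(2)[[0, 0, 1, 1]]
scores = hypergraph_conv_elm(X, Y, H)
```

Because nodes sharing a hyperedge are averaged by G before the random layer, members of the same hyperedge receive identical representations here, which is what lets the model exploit high-order structure that a plain ELM ignores.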

2020 ◽  
Vol 37 (6) ◽  
pp. 1003-1008
Author(s):  
Lei Yu ◽  
Binglin Zhang ◽  
Rui Li

In traffic image target detection, unusual targets such as a running dog have not received sufficient attention. Mature detection methods for general targets cannot be directly applied to unusual targets, owing to their high complexity, poor feature expression ability, and need for numerous manual labels. To effectively detect unusual targets in traffic images, this paper proposes a multi-level semi-supervised one-class extreme learning machine (ML-S2OCELM). Specifically, the extreme learning machine (ELM) was chosen as the basis for a classifier whose parameters can be computed directly with limited computing resources. A hypergraph Laplacian matrix was employed to better capture data smoothness, making semi-supervised classification more accurate. Furthermore, a stacked auto-encoder (AE) was introduced to implement a multi-level neural network (NN), which can extract discriminative eigenvectors of suitable dimensionality. Experiments show that the proposed method can efficiently screen out traffic images with unusual targets using only a few positive labels. The research results provide a time-efficient, resource-saving instrument for feature expression and target detection.
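The core of such a semi-supervised one-class ELM can be sketched as follows. This is an illustrative sketch under stated assumptions, not ML-S2OCELM itself: it uses an ordinary graph Laplacian rather than the paper's hypergraph Laplacian, omits the stacked auto-encoder levels, and adopts the standard manifold-regularized ELM closed form beta = (I + H^T C H + lam H^T L H)^(-1) H^T C y with labels only on a few positive samples; all names are ours.

```python
import numpy as np

def ss_one_class_elm(X, pos_idx, L, n_hidden=32, C=1.0, lam=0.1, seed=0):
    """Semi-supervised one-class ELM sketch: only positive samples are labeled."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    Hmat = np.tanh(X @ W + b)  # random ELM feature map (never trained)

    n = X.shape[0]
    y = np.zeros(n)
    y[pos_idx] = 1.0           # target 1 for the few positive labels
    Cdiag = np.zeros(n)
    Cdiag[pos_idx] = C         # unlabeled samples get zero label weight
    Cm = np.diag(Cdiag)

    # Manifold-regularized closed-form solution: the Laplacian term lam*H^T L H
    # smooths scores over the graph, propagating the positive labels
    A = np.eye(n_hidden) + Hmat.T @ Cm @ Hmat + lam * Hmat.T @ L @ Hmat
    beta = np.linalg.solve(A, Hmat.T @ (Cdiag * y))
    return Hmat @ beta         # higher score = more likely the positive class

# Toy usage: two clusters, one positive label in the first cluster
X = np.array([[0., 0.], [0.1, 0.], [0., 0.1], [5., 5.], [5.1, 5.], [5., 5.1]])
A_adj = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5)]:
    A_adj[i, j] = A_adj[j, i] = 1.0
L = np.diag(A_adj.sum(axis=1)) - A_adj   # graph Laplacian L = D - A
scores = ss_one_class_elm(X, pos_idx=[0], L=L)
```

Thresholding these scores then screens images into "normal" and "unusual", using only the handful of positive labels the abstract describes.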


Author(s):  
Haoyu Niu ◽  
Yuquan Chen ◽  
YangQuan Chen

Extreme Learning Machine (ELM) has a powerful capability to approximate regression and classification problems on large amounts of data. ELM does not need to learn the parameters of its hidden neurons, which enables it to train a thousand times faster than conventional popular learning algorithms. Since the parameters in the hidden layers are randomly generated, what is the optimal randomness? The Lévy distribution, a heavy-tailed distribution, has been shown to be the optimal randomness for finding targets in an unknown environment. Thus, the Lévy distribution has been used to generate the parameters in the hidden layers (making them more likely to reach the optimal parameters), yielding better computational results. Since the Lévy distribution is a special case of the Mittag-Leffler distribution, this paper uses the Mittag-Leffler distribution in order to obtain better performance. We show the procedure for generating the Mittag-Leffler distribution and then give the training algorithm that uses it. The experimental results show that the Mittag-Leffler distribution performs similarly to the Lévy distribution; both reach better performance than the conventional method. Detailed discussions are finally presented to explain the experimental results.
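The idea of heavy-tailed random hidden weights can be illustrated as below. The variate generator follows a published Kozubowski-type form for the (one-sided) Mittag-Leffler distribution, which reduces to the exponential distribution at alpha = 1; the sign-symmetrization of the draws and the toy regression are our own illustrative assumptions and may differ from the paper's exact training algorithm.

```python
import numpy as np

def mittag_leffler_rvs(alpha, size, rng):
    """Draw one-sided Mittag-Leffler variates, alpha in (0, 1].

    Kozubowski-type generator from two uniforms; at alpha = 1 the bracket
    equals 1 and the draw reduces to -log(U), i.e. Exp(1).
    """
    u = rng.uniform(size=size)
    v = rng.uniform(size=size)
    bracket = np.sin(alpha * np.pi) / np.tan(alpha * np.pi * v) - np.cos(alpha * np.pi)
    return -np.log(u) * bracket ** (1.0 / alpha)

def elm_train(X, Y, n_hidden=32, alpha=0.9, reg=1e-3, seed=0):
    """ELM whose untrained hidden weights are heavy-tailed Mittag-Leffler draws."""
    rng = np.random.default_rng(seed)
    # Random signs make the one-sided draws usable as weights (our assumption)
    W = rng.choice([-1.0, 1.0], size=(X.shape[1], n_hidden)) \
        * mittag_leffler_rvs(alpha, (X.shape[1], n_hidden), rng)
    b = rng.uniform(-1, 1, n_hidden)
    H = np.tanh(X @ W + b)
    # Output weights in closed form: the only "training" an ELM does
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ Y)
    return W, b, beta

# Toy usage: fit a sine curve with Mittag-Leffler random features
rng = np.random.default_rng(1)
samples = mittag_leffler_rvs(0.9, 10000, rng)     # all draws are non-negative
X = np.linspace(-3, 3, 200).reshape(-1, 1)
Y = np.sin(X)
W, b, beta = elm_train(X, Y)
pred = np.tanh(X @ W + b) @ beta
```

Because only the output weights `beta` are computed (in one linear solve), swapping the hidden-weight distribution from Gaussian to Lévy or Mittag-Leffler changes nothing else in the training procedure, which is what makes this line of experimentation cheap.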


2020 ◽  
Vol 52 (3) ◽  
pp. 1723-1744
Author(s):  
Li Li ◽  
Kaiyi Zhao ◽  
Sicong Li ◽  
Ruizhi Sun ◽  
Saihua Cai
