A Novel Class Selection Method Based on a Partial Kullback-Leibler Information Measure and Its Application to Motion Classification Problem for EMG Signal Discrimination

2012 ◽  
Vol 48 (9) ◽  
pp. 580-588 ◽  
Author(s):  
Taro SHIBANOKI ◽  
Keisuke SHIMA ◽  
Takeshi TAKAKI ◽  
Yuichi KURITA ◽  
Akira OTSUKA ◽  
...  
Symmetry ◽  
2021 ◽  
Vol 13 (7) ◽  
pp. 1290
Author(s):  
Le Wang ◽  
Yuelin Gao ◽  
Shanshan Gao ◽  
Xin Yong

In classification problems in machine learning and pattern recognition, the pre-processing of data is particularly important. Processing high-dimensional feature datasets increases the time and space complexity of computation and reduces the accuracy of classification models, so a good feature selection method is essential. This paper presents a new algorithm for feature selection that retains the selection and mutation operators of traditional genetic algorithms. On the one hand, the global search capability of the algorithm is ensured by varying the initial population size; on the other hand, the optimal mutation probability for the feature selection problem is found for each population size. During the iterations of the algorithm, the population size does not change: no matter how many transformations are applied, it remains equal to the initialized population size, and this spatial invariance is defined as symmetry. The proposed method is compared with other algorithms and validated on different datasets. The experimental results show good performance of the algorithm; in addition, we apply it to a practical Android software classification problem, where the results also show its superiority.
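The iteration the abstract describes, a fixed-size population evolved with only selection and mutation operators, can be sketched as follows. This is a minimal illustration on a toy relevance-scoring objective, not the authors' algorithm: the fitness function, tournament selection, and all parameter values here are assumptions made for the sake of the example.

```python
import random

def fitness(mask, relevance, penalty=0.1):
    # Toy objective: sum of per-feature relevance scores, minus a
    # size penalty so that small subsets of relevant features win.
    return sum(r for m, r in zip(mask, relevance) if m) - penalty * sum(mask)

def tournament_select(pop, scores, k=2):
    # Selection operator: keep the fitter of k randomly chosen individuals.
    contenders = random.sample(range(len(pop)), k)
    return pop[max(contenders, key=lambda i: scores[i])]

def mutate(mask, p_mut):
    # Mutation operator: flip each bit independently with probability p_mut.
    return [1 - b if random.random() < p_mut else b for b in mask]

def ga_feature_select(relevance, pop_size=20, p_mut=0.05, generations=50, seed=0):
    random.seed(seed)
    n = len(relevance)
    # Each individual is a binary mask over the n features.
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        scores = [fitness(ind, relevance) for ind in pop]
        # The population size stays fixed at pop_size in every generation
        # (the "symmetry" invariance described in the abstract).
        pop = [mutate(tournament_select(pop, scores), p_mut)
               for _ in range(pop_size)]
    scores = [fitness(ind, relevance) for ind in pop]
    best = max(range(pop_size), key=lambda i: scores[i])
    return pop[best], scores[best]

# Hypothetical data: features 0-2 are relevant, features 3-9 are noise.
relevance = [0.9, 0.8, 0.7] + [0.0] * 7
mask, score = ga_feature_select(relevance)
```

With these toy scores the best attainable fitness is 2.1 (selecting exactly the three relevant features); the search should converge near that value, though the exact result depends on the random seed.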


Entropy ◽  
2021 ◽  
Vol 23 (3) ◽  
pp. 298
Author(s):  
Sangun Park

It is well known that some information measures, including Fisher information and entropy, can be represented in terms of the hazard function. In this paper, we provide representations of further information measures, including quantal Fisher information and quantal Kullback-Leibler information, in terms of the hazard function and the reverse hazard function. We also provide some estimators of the quantal Kullback-Leibler information, which include the Anderson-Darling test statistic, and compare their performance.
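As a concrete instance of the hazard-function representations the abstract refers to (a standard identity for entropy, not the paper's quantal versions), the Shannon entropy of a density f with distribution function F and hazard function h_F = f/(1-F) can be written as

```latex
H(f) = -\int f(x)\,\log f(x)\,dx
     = 1 - \mathbb{E}\!\left[\log h_F(X)\right],
\qquad h_F(x) = \frac{f(x)}{1 - F(x)},
```

which follows from writing f = h_F \cdot (1-F) and noting that \int f(x)\,\log\bigl(1-F(x)\bigr)\,dx = \int_0^1 \log u\,du = -1 under the substitution u = 1-F(x).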

