Feature Selection Using Intensified Tabu Search for Supervised Classification

Tabu Search ◽  
10.5772/5590 ◽  
2008 ◽  
Author(s):  
Muhammad Atif ◽  
Jim Smith

2015 ◽  
Vol 1 (311) ◽  
Author(s):  
Katarzyna Stąpor

Discriminant Analysis can best be defined as a technique which allows the classification of an individual into one of several distinct populations on the basis of a set of measurements. Stepwise discriminant analysis (SDA) is concerned with selecting the most important variables whilst retaining the highest discrimination power possible. The process of selecting a smaller number of variables is often necessary for a variety of reasons. In existing statistical software packages, SDA is based on classic stepwise feature selection methods, and many problems with such stepwise procedures have been identified. In this work, a new method based on the tabu search metaheuristic is presented, together with experimental results on selected benchmark datasets. The results are promising.
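The abstract does not specify the intensification details, but the basic tabu search loop over feature subsets can be sketched as follows. Everything here is an illustrative assumption: the bit-flip neighborhood, the tenure value, and the `evaluate` callback (which would wrap, e.g., cross-validated discriminant-analysis accuracy) are not taken from the paper.

```python
import random

def tabu_search_feature_selection(evaluate, n_features, n_iters=50,
                                  tabu_tenure=5, seed=0):
    """Minimal tabu search over feature subsets (bit-flip neighborhood).

    evaluate: callable mapping a frozenset of feature indices to a score
              (higher is better), e.g. cross-validated accuracy.
    """
    rng = random.Random(seed)
    current = frozenset(i for i in range(n_features) if rng.random() < 0.5)
    best, best_score = current, evaluate(current)
    tabu = {}  # feature index -> iteration until which flipping it is tabu
    for it in range(n_iters):
        # Neighbors: flip one feature in or out of the current subset.
        candidates = []
        for f in range(n_features):
            neighbor = current ^ frozenset([f])
            score = evaluate(neighbor)
            # Aspiration: a tabu move is allowed if it beats the global best.
            if tabu.get(f, -1) < it or score > best_score:
                candidates.append((score, f, neighbor))
        if not candidates:
            continue
        score, f, current = max(candidates)
        tabu[f] = it + tabu_tenure  # forbid reversing this move for a while
        if score > best_score:
            best, best_score = current, score
    return best, best_score
```

The tabu list is what distinguishes this from plain hill climbing: recently flipped features cannot be flipped back for `tabu_tenure` iterations, which lets the search escape local optima.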


Author(s):  
ALEXSEY LIAS-RODRÍGUEZ ◽  
GUILLERMO SANCHEZ-DIAZ

Typical testors are useful tools for feature selection and for determining feature relevance in supervised classification problems. Computing all typical testors of a training matrix remains very expensive; all reported algorithms have exponential complexity in the number of columns of the matrix. In this paper, we introduce a faster Boolean Recursive algorithm, called fast-BR, based on the elimination of gaps and the reduction of columns. Fast-BR is designed to generate all typical testors of a training matrix with a reduced number of operations. Experimental results using this fast implementation, and a comparison with other state-of-the-art algorithms that generate typical testors, are presented.
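To make the object being computed concrete, here is a brute-force enumeration of typical testors from a basic comparison matrix. This only illustrates the definition and has exactly the exponential cost the abstract describes; the gap-elimination and column-reduction machinery of fast-BR is not reproduced here.

```python
from itertools import combinations

def typical_testors(diff_matrix):
    """Brute-force enumeration of typical testors.

    diff_matrix: boolean comparison matrix whose rows correspond to pairs of
    objects from different classes; entry [i][j] is 1 when feature j
    distinguishes that pair. A column set T is a testor when every row has a
    1 in some column of T; it is *typical* when no proper subset is a testor.
    """
    n_cols = len(diff_matrix[0])

    def is_testor(cols):
        return all(any(row[j] for j in cols) for row in diff_matrix)

    result = []
    for k in range(1, n_cols + 1):
        for cols in combinations(range(n_cols), k):
            # Testors are upward-monotone, so checking only the (k-1)-subsets
            # suffices to establish minimality.
            if is_testor(cols) and not any(
                is_testor(cols[:i] + cols[i + 1:]) for i in range(k)
            ):
                result.append(set(cols))
    return result
```

For example, on the matrix `[[1, 0, 1], [0, 1, 1]]` the typical testors are `{2}` (column 2 distinguishes both pairs alone) and `{0, 1}`; supersets such as `{0, 2}` are testors but not typical.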


2008 ◽  
Vol 41 (12) ◽  
pp. 3706-3719 ◽  
Author(s):  
Wing W.Y. Ng ◽  
Daniel S. Yeung ◽  
Michael Firth ◽  
Eric C.C. Tsang ◽  
Xi-Zhao Wang

2007 ◽  
Vol 24 (1) ◽  
pp. 110-117 ◽  
Author(s):  
M. Draminski ◽  
A. Rada-Iglesias ◽  
S. Enroth ◽  
C. Wadelius ◽  
J. Koronacki ◽  
...  

2021 ◽  
Vol 71 ◽  
pp. 11-20
Author(s):  
Michel Barlaud ◽  
Marc Antonini

This paper deals with supervised classification and feature selection with application in the context of high dimensional features. A classical approach leads to an optimization problem minimizing the within sum of squares in the clusters (ℓ2 norm) with an ℓ1 penalty in order to promote sparsity. It has been known for decades that the ℓ1 norm is more robust than the ℓ2 norm to outliers. In this paper, we deal with this issue using a new proximal splitting method for the minimization of a criterion using the ℓ1 norm both for the constraint and the loss function. Since the ℓ1 criterion is only convex and not gradient Lipschitz, we advocate the use of a Douglas-Rachford minimization solution. We take advantage of the particular form of the cost and, using a change of variable, we provide a new efficient tailored primal Douglas-Rachford splitting algorithm which is very effective on high dimensional datasets. We also provide an efficient classifier in the projected space based on medoid modeling. Experiments on two biological datasets and a computer vision dataset show that our method significantly improves the results compared to those obtained using a quadratic loss function.
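As a toy instance of the splitting scheme the abstract mentions, the sketch below applies Douglas-Rachford iterations to the scalar-separable problem min_x ||x − b||_1 + λ||x||_1, where both proximal operators reduce to soft-thresholding. This is only an illustration of why Douglas-Rachford suits non-smooth ℓ1 criteria; the paper's tailored primal algorithm, its change of variable, and its constraint set are not reproduced here.

```python
import numpy as np

def soft(v, t):
    """Soft-thresholding: the proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def douglas_rachford_l1(b, lam=0.5, gamma=1.0, n_iters=200):
    """Douglas-Rachford splitting for min_x ||x - b||_1 + lam * ||x||_1.

    Neither term is gradient Lipschitz, so gradient descent does not apply,
    but both have cheap prox operators, which is exactly the setting where
    Douglas-Rachford splitting is attractive.
    """
    b = np.asarray(b, dtype=float)
    y = np.zeros_like(b)
    for _ in range(n_iters):
        x = b + soft(y - b, gamma)          # prox of gamma * ||. - b||_1
        z = soft(2 * x - y, gamma * lam)    # prox of gamma * lam * ||.||_1
        y = y + z - x                       # reflection update on the dual variable
    return x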

