On simple one-class classification methods

Author(s):  
Zineb Noumir ◽  
Paul Honeine ◽  
Cédric Richard
2007 ◽  
Volume 6, April 2007, joint... ◽  
Author(s):  
Oleksiy Mazhelis

One-class classifiers, which use only data from a single class for training, are justified when data from other classes are difficult to obtain. In particular, their use is justified in mobile-masquerader detection, where user characteristics are classified as belonging either to the legitimate-user class or to the impostor class, and where collecting data originating from impostors is problematic. This paper systematically reviews various one-class classification methods and analyses their suitability in the context of mobile-masquerader detection. For each classification method, its sensitivity to errors in the training set, its computational requirements, and other characteristics are considered. Suitable classifiers are then identified for each category of features used in masquerader detection.
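To illustrate the general idea (this is not the classifier from the paper, and every name below is hypothetical), a minimal one-class classifier can be sketched in pure Python: it is trained only on target-class samples, learns their centroid, and flags any point whose distance from that centroid exceeds a quantile of the training distances.

```python
import math

class CentroidOneClassClassifier:
    """Toy one-class classifier: trained only on target-class data,
    it learns the class centroid and flags any point whose distance
    from the centroid exceeds a quantile of the training distances."""

    def __init__(self, quantile=0.95):
        self.quantile = quantile

    def fit(self, X):
        n, d = len(X), len(X[0])
        self.centroid = [sum(x[j] for x in X) / n for j in range(d)]
        dists = sorted(self._dist(x) for x in X)
        # Threshold = q-quantile of distances observed on the target class
        self.threshold = dists[min(int(self.quantile * n), n - 1)]
        return self

    def _dist(self, x):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, self.centroid)))

    def predict(self, x):
        # +1 = target (e.g. legitimate user), -1 = outlier (possible impostor)
        return 1 if self._dist(x) <= self.threshold else -1


# Train on "legitimate user" samples only; no impostor data is needed.
target = [(0.1 * i, 0.05 * i) for i in range(-10, 11)]
clf = CentroidOneClassClassifier().fit(target)
print(clf.predict((0.0, 0.0)))  # close to the training data -> 1
print(clf.predict((5.0, 5.0)))  # far from anything seen -> -1
```

The threshold quantile plays the role of the error tolerance discussed in the review: a lower quantile rejects more of the target class but catches impostors earlier.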


Author(s):  
Santi Seguí ◽  
Laura Igual ◽  
Jordi Vitrià

The problem of training classifiers only with target data arises in many applications where nontarget data are too costly, difficult to obtain, or not available at all. Several one-class classification methods have been presented to solve this problem, but most of them are highly sensitive to the presence of outliers in the target class. Ensemble methods have been proposed as a powerful way to improve the classification performance of binary and multi-class learning algorithms by introducing diversity among classifiers; however, their application to one-class classification has been rather limited. In this paper, we present a new ensemble method based on a nonparametric weighted bagging strategy for one-class classification, to improve accuracy in the presence of outliers. While the standard bagging strategy assumes a uniform data distribution, the method we propose here estimates a probability density based on a forest structure of the data. This assumption allows the data distribution to be estimated from simple univariate and bivariate kernel density computations. Experiments using original and noisy versions of 20 different datasets show that bagging ensemble methods applied to different one-class classifiers outperform the base one-class classification methods. Moreover, we show that, on the noisy versions of the datasets, the nonparametric weighted bagging strategy we propose outperforms the classical bagging strategy in a statistically significant way.
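The weighted-bagging idea can be sketched as follows (a simplified illustration, not the authors' implementation: the paper's forest-based density estimate is replaced here by a naive Gaussian kernel density, and `fit_centroid` is a hypothetical base learner). Bootstrap samples are drawn with density-based weights instead of uniformly, so low-density outliers rarely enter the training sets of the base one-class classifiers.

```python
import math
import random

def kde_weights(X, bandwidth=1.0):
    """Per-point sampling weights from a naive Gaussian kernel density
    estimate (a stand-in for the paper's forest-based estimate):
    low-density points, i.e. likely outliers, get small weights."""
    w = []
    for x in X:
        dens = sum(math.exp(-sum((a - b) ** 2 for a, b in zip(x, y))
                            / (2 * bandwidth ** 2)) for y in X)
        w.append(dens)
    total = sum(w)
    return [v / total for v in w]

def fit_centroid(X, quantile=0.9):
    """Hypothetical base one-class learner: centroid + distance quantile."""
    n, d = len(X), len(X[0])
    c = [sum(x[j] for x in X) / n for j in range(d)]
    dist = lambda x: math.sqrt(sum((a - b) ** 2 for a, b in zip(x, c)))
    t = sorted(dist(x) for x in X)[min(int(quantile * n), n - 1)]
    return lambda x: 1 if dist(x) <= t else -1

def weighted_bagging(X, fit_base, n_estimators=15, rng=None):
    """Weighted bagging: bootstrap samples are drawn with density-based
    weights rather than uniformly (classical bagging would pass no
    weights), so outliers rarely enter the base training sets."""
    rng = rng or random.Random(0)
    weights = kde_weights(X)
    return [fit_base(rng.choices(X, weights=weights, k=len(X)))
            for _ in range(n_estimators)]

def ensemble_predict(models, x):
    """Majority vote over the base one-class classifiers (+1/-1)."""
    return 1 if sum(m(x) for m in models) >= 0 else -1


# Target cluster around the origin, contaminated by one outlier.
data_rng = random.Random(42)
X = [(data_rng.gauss(0, 1), data_rng.gauss(0, 1)) for _ in range(40)]
X.append((8.0, 8.0))  # outlier sitting inside the "target" training set

models = weighted_bagging(X, fit_centroid, rng=random.Random(1))
print(ensemble_predict(models, (0.0, 0.0)))  # inside the cluster -> 1
print(ensemble_predict(models, (8.0, 8.0)))  # the outlier -> -1
```

Replacing `weights=weights` with uniform sampling recovers classical bagging, which lets the outlier into many bootstrap samples; the density weighting is what buys the robustness the abstract describes.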

