Optimal Bayesian Classifier
Recently Published Documents


TOTAL DOCUMENTS: 4 (FIVE YEARS: 0)

H-INDEX: 2 (FIVE YEARS: 0)

2001 · Vol. 10 (01n02) · pp. 157-179
Author(s): Mark D. Happel, Peter Bock

The design of an optimal Bayesian classifier for multiple features depends on the estimation of multidimensional joint probability density functions and therefore requires a design sample size that grows exponentially with the number of dimensions. A method was developed that combines the classification decisions of marginal-density classifiers using an additional, second-stage classifier. Unlike voting methods, this method can select a more appropriate class than any of those selected by the marginal classifiers, thus "overriding" their decisions. It is shown that this method always exhibits an asymptotic probability of error no worse than the probability of error of the best marginal classifier.
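The combination scheme described above can be sketched as a form of stacking: train one classifier per marginal (1-D) density, then train a second-stage combiner on the vector of their decisions. The sketch below uses toy Gaussian data and a simple majority-per-pattern combiner; it is a minimal illustration of the idea, not the authors' exact method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-class data with two features (assumed data, for illustration only).
n = 400
y = rng.integers(0, 2, n)
X = rng.normal(loc=y[:, None] * np.array([1.5, 0.5]), scale=1.0, size=(n, 2))

def fit_marginal(x, y):
    """Fit a 1-D Gaussian class-conditional model (one marginal classifier)."""
    params = {}
    for c in (0, 1):
        xc = x[y == c]
        params[c] = (xc.mean(), xc.std() + 1e-9, len(xc) / len(x))
    return params

def marginal_decision(params, x):
    """Pick the class with the highest 1-D log-posterior (up to a constant)."""
    scores = []
    for c in (0, 1):
        mu, sigma, prior = params[c]
        logp = -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma) + np.log(prior)
        scores.append(logp)
    return np.argmax(np.stack(scores), axis=0)

# One marginal classifier per feature dimension.
models = [fit_marginal(X[:, j], y) for j in range(2)]
decisions = np.stack(
    [marginal_decision(m, X[:, j]) for j, m in enumerate(models)], axis=1
)

# Second-stage classifier over the vector of marginal decisions: for each
# observed decision pattern, predict the class most frequent under that
# pattern in the training data. Because the combiner sees the whole pattern,
# it can "override" the marginal classifiers rather than merely vote.
combiner = {}
for pattern in {tuple(d) for d in decisions}:
    mask = (decisions == pattern).all(axis=1)
    combiner[pattern] = int(np.bincount(y[mask]).argmax())

y_hat = np.array([combiner[tuple(d)] for d in decisions])
```

On the training sample, the majority-per-pattern combiner's error can never exceed that of any single marginal classifier, since each marginal decision is itself a function of the pattern; the paper's asymptotic claim is the distributional analogue of this.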


1992 · Vol. 03 (03) · pp. 219-235
Author(s): Jake Reynolds, Lionel Tarassenko

Neural networks have recently been applied to real-world speech recognition problems with a great deal of success. This paper develops a strategy for optimising a neural network known as the Radial Basis Function (RBF) classifier on a large spoken letter recognition problem designed by British Telecom Research Laboratories. The strategy can be viewed as a compromise between a fully adaptive approach, which involves prohibitively large amounts of computation, and a heuristic approach, which results in poor generalisation. A value for the optimal number of kernel functions is suggested, and methods for determining the positions of the centres and the values of the kernel function widths are provided. During the evolution of the optimisation strategy, it was demonstrated that spatial organisation of the centres does not adversely affect the ability of the classifier to generalise. An RBF employing the optimisation strategy achieved a lower error rate than Woodland's multilayer perceptron [26] and two traditional static pattern classifiers on the same problem. The error rate of the RBF was very close to the estimated minimum error rate obtainable with an optimal Bayesian classifier. An examination of the computational requirements of the classifiers illustrated a significant trade-off between the computational investment in training and the level of generalisation achieved.
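The three design choices the abstract names (number of kernels, centre positions, kernel widths) can be seen in a minimal RBF classifier: fixed Gaussian centres, a heuristic width per centre, and output weights fitted by linear least squares. The sketch below uses toy data and a random-subset centre placement; the paper's actual tuning procedure for these quantities is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 2-class data standing in for the spoken-letter task (assumed data).
n = 300
y = rng.integers(0, 2, n)
X = rng.normal(loc=y[:, None] * 2.0, scale=1.0, size=(n, 2))

def gaussian_design(X, C, widths):
    """Matrix of Gaussian kernel activations for inputs X and centres C."""
    sq_dist = np.linalg.norm(X[:, None] - C[None, :], axis=-1) ** 2
    return np.exp(-sq_dist / (2.0 * widths ** 2))

def fit_rbf(X, y, n_centres=10):
    """RBF classifier: fixed centres, heuristic widths, least-squares weights."""
    # Centres: a random training subset (the paper instead optimises
    # their number and placement).
    idx = rng.choice(len(X), n_centres, replace=False)
    C = X[idx]
    # Widths: mean distance from each centre to the others (one common
    # heuristic for setting kernel widths).
    d = np.linalg.norm(C[:, None] - C[None, :], axis=-1)
    widths = d.sum(axis=1) / (n_centres - 1)
    # Output weights: linear least squares against one-hot class targets.
    Phi = gaussian_design(X, C, widths)
    T = np.eye(2)[y]
    W, *_ = np.linalg.lstsq(Phi, T, rcond=None)
    return C, widths, W

def predict(C, widths, W, X):
    return (gaussian_design(X, C, widths) @ W).argmax(axis=1)

C, widths, W = fit_rbf(X, y)
y_hat = predict(C, widths, W, X)
```

Because only the output weights are fitted (a linear problem), training is far cheaper than fully adaptive gradient training of all parameters; this is the computation-versus-generalisation trade-off the abstract refers to.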

