A lower bound on Bayes risk in classification problems

1976 ◽  
Vol 28 (1) ◽  
pp. 385-387 ◽  
Author(s):  
S. N. U. A. Kirmani

2018 ◽  
Vol 38 (2) ◽  
pp. 429-440
Author(s):  
Rafał Wieczorek ◽  
Hanna Podsędkowska

An entropic upper bound for the Bayes risk in a general quantum case is presented. We obtain a generalization of the entropic lower bound for the probability of detection. Our result gives an upper bound on the Bayes risk for a particular choice of loss function, namely for the probability of detection, in the rather general setting of an arbitrary finite von Neumann algebra. It is also shown under which conditions the indicated upper bound is attained.
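
For orientation, the classical finite-alphabet counterpart of such entropic bounds is Fano's inequality, sketched below; the paper itself works in an arbitrary finite von Neumann algebra, which this classical form does not capture.

```latex
% Fano's inequality: a classical entropic bound on detection error.
% X is a label over M classes, Y the observation, P_e = 1 - P_detect.
\[
  H(X \mid Y) \;\le\; h_b(P_e) + P_e \log(M-1),
  \qquad
  h_b(p) = -p \log p - (1-p)\log(1-p).
\]
% Inverting this relation lower-bounds P_e, i.e. it upper-bounds the
% probability of detection 1 - P_e in terms of the conditional entropy.
```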


2008 ◽  
Vol 20 (11) ◽  
pp. 2792-2838 ◽  
Author(s):  
Masanori Kawakita ◽  
Shinto Eguchi

We propose a local boosting method for classification problems, borrowing an idea from the local likelihood method. Our proposal, local boosting, includes a simple localization device for computational feasibility. We prove the Bayes risk consistency of local boosting in the framework of probably approximately correct (PAC) learning. Inspection of the proof provides a useful viewpoint for comparing ordinary boosting and local boosting with respect to the estimation error and the approximation error. Both boosting methods are Bayes risk consistent if their approximation errors decrease to zero. Compared to ordinary boosting, local boosting may perform better by controlling the trade-off between the estimation error and the approximation error. Ordinary boosting with complicated base classifiers, or other strong classification methods such as kernel machines, may achieve classification performance comparable to local boosting with simple base classifiers, for example decision stumps. Local boosting, however, has an advantage in interpretability: with simple base classifiers it offers a simple way to specify which features are informative and how their values contribute to a classification rule, albeit locally. Several numerical studies on real data sets confirm these advantages of local boosting.
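
As a rough illustration of the localization idea, here is a minimal sketch in which an AdaBoost-style loop with decision stumps is run on sample weights pre-multiplied by a Gaussian kernel centred at the query point. The kernel, its bandwidth, and the stump learner are illustrative assumptions, not the authors' exact device.

```python
import numpy as np

def stump_fit(X, y, w):
    """Weighted decision stump: best (feature, threshold, polarity)."""
    best = (0, 0.0, 1, np.inf)
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = pol * np.where(X[:, j] <= thr, 1, -1)
                err = w[pred != y].sum()
                if err < best[3]:
                    best = (j, thr, pol, err)
    return best

def stump_predict(X, j, thr, pol):
    return pol * np.where(X[:, j] <= thr, 1, -1)

def local_boost_predict(X, y, x0, rounds=10, bandwidth=1.0):
    """Classify x0 (labels in {-1, +1}) with kernel-localized boosting."""
    # Localization: Gaussian kernel weights centred at the query point
    # (an assumption made for illustration, not the paper's exact scheme).
    k = np.exp(-np.sum((X - x0) ** 2, axis=1) / (2 * bandwidth ** 2))
    w = k / k.sum()
    score = 0.0
    for _ in range(rounds):
        j, thr, pol, err = stump_fit(X, y, w)
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        # AdaBoost reweighting; the kernel stays baked into the weights.
        w *= np.exp(-alpha * y * stump_predict(X, j, thr, pol))
        w /= w.sum()
        score += alpha * stump_predict(x0[None, :], j, thr, pol)[0]
    return np.sign(score)

# Toy usage: two Gaussian blobs in the plane.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(1, 1, (50, 2))])
y = np.array([-1] * 50 + [1] * 50)
print(local_boost_predict(X, y, np.array([0.8, 0.9])))  # expect 1.0
```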


Entropy ◽  
2021 ◽  
Vol 23 (2) ◽  
pp. 161
Author(s):  
Ken-ichi Koike ◽  
Shintaro Hashimoto

This paper presents a difference-type lower bound for the Bayes risk as a difference-type extension of the Borovkov–Sakhanenko bound. The resulting bound asymptotically improves on the Bobrovsky–Mayer-Wolf–Zakai bound, which is a difference-type extension of the Van Trees bound. Some examples are also given.
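
For reference, the Van Trees (Bayesian Cramér–Rao) inequality at the base of this hierarchy can be stated as follows; difference-type extensions, loosely in the spirit of the Chapman–Robbins bound, replace the derivatives by finite differences.

```latex
% Van Trees inequality: for a prior density \pi on \theta, model Fisher
% information I(\theta), and any estimator \hat\theta(X),
\[
  \mathbb{E}\!\left[(\hat\theta(X) - \theta)^2\right]
  \;\ge\;
  \frac{1}{\mathbb{E}_\pi[I(\theta)] + \mathcal{I}(\pi)},
  \qquad
  \mathcal{I}(\pi) = \int \frac{\pi'(\theta)^2}{\pi(\theta)}\,d\theta,
\]
% where the expectation on the left runs over both \theta \sim \pi and X.
```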


1998 ◽  
Vol 48 (1-2) ◽  
pp. 83-92 ◽  
Author(s):  
Subhash C. Bagui ◽  
Somnath Datta

In this article we study some properties of the Bayes risk with respect to the prior probabilities in two-class classification problems. We show that the Bayes risk is maximized when the ratio of the prior probabilities equals the median of the likelihood ratio statistic under the average density. We also show that when the class probability density functions (pdfs) are symmetric, differ only in location, and have a monotone likelihood ratio, the Bayes risk has a unique maximum at prior probability 1/2 and decreases as the difference between the prior probabilities increases. Several interesting examples are given to illustrate the results.
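
The location-symmetric result is easy to check numerically. A minimal sketch, assuming two unit-variance normal densities centred at -1 and +1 (an illustrative choice): the 0-1 Bayes risk, i.e. the integral of the pointwise minimum of the prior-weighted class densities, peaks at prior probability 1/2.

```python
import numpy as np

def normal_pdf(x, mu, sigma=1.0):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Two symmetric densities differing only in location.
x = np.linspace(-10.0, 10.0, 20001)
dx = x[1] - x[0]
f0, f1 = normal_pdf(x, -1.0), normal_pdf(x, 1.0)

# Bayes risk under 0-1 loss as a function of the prior p on class 0:
# R(p) = integral of min(p * f0, (1 - p) * f1).
priors = np.linspace(0.01, 0.99, 99)
risk = [np.sum(np.minimum(p * f0, (1 - p) * f1)) * dx for p in priors]
print(priors[np.argmax(risk)])  # 0.5, matching the stated maximizer
```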

