Kernel-Predictability: A New Information Measure and Its Application to Image Registration

Author(s):  
Héctor Fernando Gómez-García ◽  
José L. Marroquín ◽  
Johan Van Horebeek
2009 ◽  
Vol 2009 ◽  
pp. 1-13
Author(s):  
Fritz Wysotzki ◽  
Peter Geibel

This article describes how misclassification costs attached to individual training objects can be used to construct decision trees that minimize cost rather than error rate. This is demonstrated by defining modified, cost-dependent probabilities and a new, cost-dependent information measure, and by using a cost-sensitive extension of the CAL5 algorithm for learning decision trees. The cost-dependent information measure ensures the selection of the locally next best, that is, cost-minimizing, discriminating attribute in the sequential construction of the classification trees. This measure is shown to be a cost-dependent generalization of the classical information measure introduced by Shannon, which depends only on classical probabilities. It is therefore of general importance and extends classical information theory, knowledge processing, and cognitive science, since subjective evaluations of decision alternatives can be included in entropy and the transferred information. Decision trees can then be viewed as cost-minimizing decoders for class symbols emitted by a source and coded by feature vectors. Experiments with two artificial datasets and one application example show that this approach is more accurate than a method that uses class-dependent costs specified a priori by experts.
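As a rough illustration, a minimal Python sketch of a cost-dependent entropy for attribute selection follows. It assumes cost-weighted class probabilities of the form p'_i = c_i p_i / Σ_j c_j p_j; the exact weighting used in the paper's CAL5 extension may differ, and all names here are illustrative.

import math

def cost_weighted_probs(probs, costs):
    """Reweight class probabilities by their misclassification costs."""
    weighted = [p * c for p, c in zip(probs, costs)]
    total = sum(weighted)
    return [w / total for w in weighted]

def cost_entropy(probs, costs):
    """Cost-dependent Shannon entropy; reduces to the classical
    entropy when all costs are equal."""
    q = cost_weighted_probs(probs, costs)
    return -sum(p * math.log2(p) for p in q if p > 0)

# Example: two equally likely classes, but misclassifying class 0
# is three times as expensive, so the split criterion shifts.
print(cost_entropy([0.5, 0.5], [1.0, 1.0]))  # 1.0 bit (classical case)
print(cost_entropy([0.5, 0.5], [3.0, 1.0]))  # about 0.811 bits

With equal costs the measure coincides with Shannon entropy, which is the generalization property the abstract emphasizes.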


2015 ◽  
Vol 60 (22) ◽  
pp. 8767-8790 ◽  
Author(s):  
Bicao Li ◽  
Guanyu Yang ◽  
Jean Louis Coatrieux ◽  
Baosheng Li ◽  
Huazhong Shu

2015 ◽  
Vol 2015 ◽  
pp. 1-8 ◽  
Author(s):  
Madan Mohan Sati ◽  
Nitin Gupta

We propose a generalized cumulative residual information measure based on Tsallis entropy, together with its dynamic version. We study characterizations of the proposed information measure and define new classes of life distributions based on it. Some applications are provided in relation to weighted and equilibrium probability models. Finally, the empirical cumulative Tsallis entropy is proposed to estimate the new information measure.
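A minimal sketch of an empirical cumulative residual Tsallis entropy in Python, assuming the common form η_α(X) = (1 − ∫₀^∞ S(x)^α dx) / (α − 1) for a nonnegative random variable with survival function S and α ≠ 1; the paper's exact estimator may differ in details.

def empirical_cr_tsallis(sample, alpha):
    """Plug the empirical survival function into the cumulative
    residual Tsallis entropy of order alpha (alpha != 1)."""
    if alpha == 1:
        raise ValueError("alpha must differ from 1")
    x = sorted(sample)
    n = len(x)
    # S = 1 on [0, x_(1)) for a nonnegative sample.
    integral = x[0]
    # Empirical survival is (n - i) / n between the i-th and
    # (i+1)-th order statistics; integrate S^alpha piecewise.
    integral += sum(((n - i) / n) ** alpha * (x[i] - x[i - 1])
                    for i in range(1, n))
    return (1.0 - integral) / (alpha - 1.0)

# Example: small nonnegative sample, alpha = 2.
print(empirical_cr_tsallis([0.2, 0.5, 1.1, 1.8, 3.0], 2.0))

Because the empirical survival function is a step function, the integral reduces to a finite sum over gaps between order statistics, which is what makes the plug-in estimator straightforward.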


2012 ◽  
Vol 03 (02) ◽  
pp. 175-178 ◽  
Author(s):  
Witold Kosiński ◽  
Paweł Michalak ◽  
Piotr Gut

2009 ◽  
Vol 40 (1) ◽  
pp. 41-58 ◽  
Author(s):  
Satish Kumar

In the present communication, I define a new information measure called the "$\alpha$-R-norm information measure". It is characterized using an infimum operation in Section 2 and axiomatically in Section 3. Its properties are studied in Section 4, and the joint and conditional $\alpha$-R-norm information measures in Section 5.
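For orientation, here is a Python sketch of the classical R-norm information measure of Boekee and Van der Lubbe, H_R(P) = R/(R−1) · (1 − (Σ_i p_i^R)^{1/R}), which the $\alpha$-R-norm measure generalizes with a second parameter; the exact two-parameter form is not reproduced in this abstract, so only the base measure is shown.

def r_norm_information(probs, R):
    """R-norm information of a discrete distribution (R > 0, R != 1)."""
    if R <= 0 or R == 1:
        raise ValueError("R must be positive and different from 1")
    norm = sum(p ** R for p in probs) ** (1.0 / R)
    return R / (R - 1.0) * (1.0 - norm)

# Tends to Shannon entropy as R -> 1 and to 1 - max_i p_i
# as R -> infinity.
print(r_norm_information([0.5, 0.5], 2.0))  # 2 * (1 - sqrt(0.5)) ~ 0.586
print(r_norm_information([0.9, 0.1], 2.0))  # smaller: less uncertainty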


2017 ◽  
Vol 9 (4) ◽  
pp. 16-36 ◽  
Author(s):  
Souad Taleb Zouggar ◽  
Abdelkader Adla

To compute the partition quality of a decision tree, we propose a new measure called NIM ("New Information Measure"). The measure is simpler than, performs comparably to, and sometimes outperforms the existing measures used with tree-based methods. The experimental results using the MONITDIAB application (Taleb & Atmani, 2013) and datasets from the UCI repository (Asuncion & Newman, 2007) confirm the classification capabilities of our proposal in comparison to the Shannon measure used with the ID3 and C4.5 decision tree methods.
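For reference, a Python sketch of the Shannon-entropy partition quality (information gain) used by ID3, which is the baseline NIM is compared against; NIM's own formula is not reproduced in this abstract.

import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def information_gain(labels, partition):
    """Entropy reduction achieved by splitting `labels` into the
    subsets listed in `partition`."""
    n = len(labels)
    remainder = sum(len(s) / n * entropy(s) for s in partition)
    return entropy(labels) - remainder

# Example: a binary split that separates the classes perfectly
# recovers the full parent entropy (1 bit here).
labels = ["a", "a", "b", "b"]
print(information_gain(labels, [["a", "a"], ["b", "b"]]))  # 1.0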

