Hartley Entropy
Recently Published Documents


TOTAL DOCUMENTS: 5 (five years: 0)
H-INDEX: 2 (five years: 0)

Entropy, 2019, Vol. 21(2), p. 163
Author(s): Qian Pan, Deyun Zhou, Yongchuan Tang, Xiaoyang Li, Jichuan Huang

Dempster-Shafer evidence theory (DST) has shown great advantages in tackling uncertainty in a wide variety of applications. However, how to quantify the information-based uncertainty of a basic probability assignment (BPA) with belief entropy in the DST framework is still an open issue. The main work of this study is to define a new belief entropy for measuring the uncertainty of a BPA. The proposed belief entropy has two components. The first component is based on the summation of the probability mass function (PMF) of the single events contained in each focal element of the BPA, where the PMF is obtained by the plausibility transformation. The second component is the same as the weighted Hartley entropy. The two components effectively measure, respectively, the discord uncertainty and the non-specificity uncertainty found in the DST framework. The proposed belief entropy is proved to satisfy the majority of the desired properties for an uncertainty measure in the DST framework. In addition, when the BPA is a probability distribution, the proposed measure degenerates to the Shannon entropy. The feasibility and superiority of the new belief entropy are verified by the results of numerical experiments.
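The abstract does not give the closed-form definition, so the following is only a minimal sketch of the two-component construction it describes. The assumptions are mine: the discord term is taken as -Σ_A m(A)·log₂(Σ_{x∈A} Pl_P(x)), with Pl_P the plausibility transformation, and the non-specificity term as the weighted Hartley entropy Σ_A m(A)·log₂|A|; the paper's exact formula may differ.

```python
from math import log2

def plausibility_transform(bpa, frame):
    # Pl(x) = sum of m(A) over all focal elements A containing x;
    # Pl_P normalizes Pl to a probability mass function over the frame.
    pl = {x: sum(m for A, m in bpa.items() if x in A) for x in frame}
    total = sum(pl.values())
    return {x: v / total for x, v in pl.items()}

def belief_entropy(bpa, frame):
    # Assumed two-component form (my reading of the abstract, not verbatim):
    # discord from the plausibility transform + weighted Hartley entropy.
    pl_p = plausibility_transform(bpa, frame)
    discord = -sum(m * log2(sum(pl_p[x] for x in A)) for A, m in bpa.items())
    nonspecificity = sum(m * log2(len(A)) for A, m in bpa.items())
    return discord + nonspecificity

frame = ("a", "b")

# Bayesian BPA (singletons only): reduces to Shannon entropy, here 1 bit.
bayesian = {frozenset("a"): 0.5, frozenset("b"): 0.5}
print(belief_entropy(bayesian, frame))  # 1.0

# Vacuous BPA m({a,b}) = 1: zero discord, pure non-specificity log2(2) = 1.
vacuous = {frozenset("ab"): 1.0}
print(belief_entropy(vacuous, frame))  # 1.0
```

Note how the sketch reproduces the abstract's degeneracy claim: for a Bayesian BPA every focal element is a singleton, the Hartley term vanishes, and the discord term is exactly the Shannon entropy.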


2016
Author(s): Zuzana Krbcová, Jaromír Kukal, Jan Svihlik, Karel Fliegel

2016, Vol. 27(9), pp. 1983-1990
Author(s): Amit Chattopadhyay, Suviseshamuthu Easter Selvan, Umberto Amato

2004, Vol. 11(03), pp. 257-266
Author(s): B. H. Lavenda

The PAE cannot be made a basis for either a generalized statistical mechanics or a generalized information theory. Either statistical independence must be waived, or the expression of the averaged conditional probability as the difference between the marginal and joint entropies must be relinquished. The same inequality, relating the PAE to the Rényi entropy, when applied to the mean code length produces an expression that is unbounded as the order of the code length approaches infinity. Since the mean code length associated with the Rényi entropy is finite, and can be made to come as close to the Hartley entropy as desired in the same limit, the PAE have a more limited range of validity than the Rényi entropy, which they approximate.
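The limiting relation invoked here between the Rényi and Hartley entropies can be illustrated numerically. This is a generic sketch of a standard fact, not of the paper's coding-length argument: the Rényi entropy H_α = log₂(Σ p_i^α)/(1 - α) tends to the Hartley entropy log₂|support| as the order α tends to 0.

```python
from math import log2

def renyi_entropy(probs, alpha):
    # Rényi entropy of order alpha (alpha != 1), in bits,
    # computed over the support of the distribution.
    support = [p for p in probs if p > 0]
    return log2(sum(p ** alpha for p in support)) / (1 - alpha)

p = [0.7, 0.2, 0.1]
hartley = log2(len(p))  # Hartley entropy of a 3-point support, ~1.585 bits

# H_alpha approaches the Hartley entropy as alpha -> 0.
for alpha in (0.5, 0.1, 0.01):
    print(alpha, renyi_entropy(p, alpha))
```

For this distribution the values climb toward log₂ 3 as α shrinks, since the sum Σ p_i^α counts each support point with weight approaching 1.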

