A Review of Shannon and Differential Entropy Rate Estimation

Entropy ◽ 2021 ◽ Vol 23 (8) ◽ pp. 1046
Author(s): Andrew Feutrill, Matthew Roughan

In this paper, we present a review of Shannon and differential entropy rate estimation techniques. The entropy rate, the average information gain per observation of a stochastic process, measures the uncertainty and complexity of that process. We discuss the estimation of entropy rate from empirical data, and review both parametric and non-parametric techniques. On the parametric side, we examine a range of assumptions on the properties of the process, focusing in particular on Markov and Gaussian assumptions. Non-parametric estimation relies on limit theorems that relate the entropy rate to quantities computable from observations; to discuss these, we introduce the underlying theory and the practical implementation of estimators of this type.
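As a concrete illustration of the parametric approach the abstract mentions, the sketch below shows a plug-in entropy rate estimate under a first-order Markov assumption: estimate the transition matrix from observed transition counts, estimate the stationary distribution from empirical state frequencies, and combine them. This is a minimal illustrative sketch, not the paper's own implementation; all function and variable names here are our own.

```python
import numpy as np

def markov_entropy_rate(seq, k):
    """Plug-in entropy rate estimate (bits/symbol) for a sequence `seq`
    of states in {0, ..., k-1}, assuming a first-order Markov chain.

    H = -sum_i pi_i sum_j P_ij log2 P_ij, with P estimated from
    transition counts and pi from empirical state frequencies.
    """
    seq = np.asarray(seq)
    # Count observed transitions a -> b.
    counts = np.zeros((k, k))
    for a, b in zip(seq[:-1], seq[1:]):
        counts[a, b] += 1
    # Row-normalise to get the estimated transition matrix P.
    rows = counts.sum(axis=1, keepdims=True)
    P = np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)
    # Empirical state frequencies as the stationary distribution estimate.
    pi = np.bincount(seq, minlength=k) / len(seq)
    # Sum -pi_i * P_ij * log2(P_ij) over entries with P_ij > 0.
    logP = np.zeros_like(P)
    mask = P > 0
    logP[mask] = np.log2(P[mask])
    return float(-np.sum(pi[:, None] * P * logP))
```

For a deterministic alternating sequence the estimate is 0 bits/symbol, while a sequence whose transitions are close to uniform over two states gives an estimate near 1 bit/symbol. Plug-in estimators of this kind are known to be biased downward for short sequences, which is one motivation for the bias-corrected and non-parametric methods the review surveys.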

