Information-Theoretic Bound on the Entropy Production to Maintain a Classical Nonequilibrium Distribution Using Ancillary Control

Entropy ◽ 2017 ◽ Vol 19 (7) ◽ pp. 333

1977 ◽ Vol 55 (9) ◽ pp. 1588-1591
Author(s): Andrew W. Yau, Huw O. Pritchard

Four theorems on the complete monotonicity of entropy production in isothermal bulk relaxation are proved, and in two further cases numerical experiments indicate that the entropy production is likely monotonic as well. The total entropy and the total energy do not, in general, relax with the same rate constant, and under many conditions the time constant for entropy relaxation approaches twice the time constant for energy relaxation. We speculate that the maximum-entropy principle used in the information-theoretic approach to bulk relaxation may be equivalent to the principle of complete monotonicity of entropy production.
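
For reference, a sketch of the standard analytic definition of complete monotonicity (the abstract does not spell out the authors' exact formulation, so the usual sense is assumed): a function f on (0, ∞) is completely monotone if all of its derivatives alternate in sign,

\[
(-1)^n \frac{d^n f}{dt^n}(t) \;\ge\; 0 \qquad \text{for all } t > 0 \text{ and } n = 0, 1, 2, \dots
\]

By Bernstein's theorem, every such function is a mixture of decaying exponentials, which is why the notion arises naturally for relaxation processes composed of exponential decays.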


Author(s): Ryan Ka Yau Lai, Youngah Do

This article explores a method of creating confidence bounds for information-theoretic measures in linguistics, such as entropy, Kullback-Leibler Divergence (KLD), and mutual information. We show that a useful measure of uncertainty can be derived from simple statistical principles, namely the asymptotic distribution of the maximum likelihood estimator (MLE) and the delta method. Three case studies from phonology and corpus linguistics are used to demonstrate how to apply it and examine its robustness against common violations of its assumptions in linguistics, such as insufficient sample size and non-independence of data points.
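
As an illustration of the ingredients named in the abstract, here is a minimal sketch of a delta-method confidence interval for the plug-in entropy estimate. This is not the authors' code; the function name, the use of natural logarithms (nats), and the 95% level are assumptions. The MLE of the category probabilities is the vector of relative frequencies, and the delta method gives the asymptotic variance of the plug-in entropy H = -sum(p_i log p_i) as (sum(p_i (log p_i)^2) - H^2) / n.

```python
import numpy as np
from statistics import NormalDist

def entropy_mle_ci(counts, alpha=0.05):
    """Plug-in (MLE) entropy in nats with a delta-method confidence interval.

    Hypothetical helper illustrating the approach described in the abstract,
    not the authors' implementation.
    """
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    p = counts / n                      # MLE of the category probabilities
    p = p[p > 0]                        # 0 * log 0 = 0 by convention
    logp = np.log(p)
    h = -np.sum(p * logp)               # plug-in entropy estimate (nats)
    # Delta method: Var[H_hat] ~ Var[-log p(X)] / n, estimated under p_hat.
    # (The variance is 0 for a uniform distribution, where the interval
    # collapses and the normal approximation is unreliable.)
    var = (np.sum(p * logp**2) - h**2) / n
    z = NormalDist().inv_cdf(1 - alpha / 2)
    half = z * np.sqrt(var)
    return h, (h - half, h + half)

# Example: entropy of a small, made-up phoneme-frequency table.
h, (lo, hi) = entropy_mle_ci([50, 30, 15, 5])
print(f"H = {h:.3f} nats, 95% CI = ({lo:.3f}, {hi:.3f})")
```

The same delta-method recipe extends to KLD and mutual information by differentiating those functionals with respect to the multinomial probabilities, which is the generalization the article pursues.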

