absolute entropy
Recently Published Documents


TOTAL DOCUMENTS: 47 (FIVE YEARS: 1)

H-INDEX: 13 (FIVE YEARS: 0)

2020 ◽  Vol 526 ◽  pp. 112815
Author(s): Xue Yan, Tian Lan, Qingzhu Jia, Fangyou Yan, Qiang Wang

Author(s): Constantin Bratianu

Abstract: The purpose of this paper is to present the evolution of the concept of entropy from engineering to knowledge management, passing through information theory, linguistic entropy, and economic entropy. The concept of entropy was introduced into thermodynamics by Rudolf Clausius in 1865 as a measure of heat transfer between two bodies at different temperatures. As a natural phenomenon, heat flows from the body with the higher temperature toward the body with the lower temperature. However, Clausius defined only the change in entropy of a system, not its absolute entropy. Ludwig Boltzmann later defined the absolute entropy by studying the behavior of gas molecules in a thermal field. The computational formula defined by Boltzmann relates the microstates of a thermal system to its macrostates: the more uniform the probability distribution over the microstates, the higher the entropy. The second law of thermodynamics says that in an isolated system, with no intervention from outside, entropy increases continuously. The concept of entropy proved so powerful that many researchers have tried to extend its semantic area and its domain of application. In 1948, Claude E. Shannon introduced the concept of information entropy, with the same computational formula as Boltzmann's but a different interpretation. This concept solved many engineering communication problems and is used extensively in information theory. Nicholas Georgescu-Roegen applied the concept of entropy and the second law of thermodynamics to economics and business, and today many researchers in economics use entropy to analyze a wide range of phenomena. The present paper explores the possibility of using the concept of knowledge entropy in knowledge management.
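The claim that a more uniform distribution over microstates yields higher entropy can be checked directly with Shannon's formula H = −Σ p·log₂(p), which (up to the constant and units) is the same computational form as Boltzmann's. A minimal sketch, with illustrative distributions chosen here for the example:

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H = -sum(p * log(p)) of a discrete distribution."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A uniform distribution over 4 microstates maximizes entropy:
uniform = [0.25, 0.25, 0.25, 0.25]
# A sharply peaked distribution over the same 4 states has much lower entropy:
peaked = [0.97, 0.01, 0.01, 0.01]

print(shannon_entropy(uniform))  # 2.0 bits (log2 of 4 states)
print(shannon_entropy(peaked))   # roughly 0.24 bits
```

The uniform case attains the maximum log₂(4) = 2 bits; any departure from uniformity lowers H, mirroring the thermodynamic statement in the abstract.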


Entropy ◽  2019 ◽  Vol 21 (9) ◽  pp. 825
Author(s): Martin Tunnicliffe, Gordon Hunter

While Shannon’s differential entropy adequately quantifies a dimensioned random variable’s information deficit under a given measurement system, the same cannot be said of differential weighted entropy in its existing formulation. We develop the weighted and residual weighted entropies of a dimensioned quantity from their discrete-summation origins, exploring the relationship between their absolute and differential forms, and thus derive a “differentialized” absolute entropy based on a chosen “working granularity” consistent with Buckingham’s Π-theorem. We apply this formulation to three common continuous distributions (exponential, Gaussian, and gamma) and consider policies for optimizing the working granularity.
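The paper's own "differentialized" weighted formulation is not reproduced here; the sketch below illustrates only the standard relationship it builds on: discretizing a continuous variable at a working granularity Δ gives an absolute (discrete) entropy of approximately h − log₂ Δ, where h is the differential entropy. The Gaussian parameters and the choice Δ = 0.05 are illustrative assumptions, not values from the paper.

```python
import math
import random

random.seed(0)
sigma = 2.0   # standard deviation of an example Gaussian
delta = 0.05  # chosen "working granularity" (bin width); an arbitrary example

# Closed-form differential entropy of a Gaussian, in bits:
h_diff = 0.5 * math.log2(2 * math.pi * math.e * sigma**2)

# Empirical absolute entropy of samples binned at granularity delta:
samples = [random.gauss(0.0, sigma) for _ in range(200_000)]
counts = {}
for x in samples:
    b = math.floor(x / delta)
    counts[b] = counts.get(b, 0) + 1
n = len(samples)
h_abs = -sum(c / n * math.log2(c / n) for c in counts.values())

# For fine granularity, H_abs ~ h_diff - log2(delta):
print(h_abs, h_diff - math.log2(delta))
```

The two printed values agree closely, showing how a discrete "absolute" entropy depends on the granularity through the −log₂ Δ term; choosing Δ is exactly the kind of policy question the abstract raises.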


2017 ◽  Vol 1 (1) ◽  pp. 20-25
Author(s): Vladimir N. Emel’yanenko, Vladimir V. Turovtsev, Yulia A. Fedina, ...
