entropic measure
Recently Published Documents

TOTAL DOCUMENTS: 38 (five years: 5)
H-INDEX: 11 (five years: 0)

Entropy ◽  
2021 ◽  
Vol 23 (9) ◽  
pp. 1098
Author(s):  
Yusuke Shibasaki ◽  
Minoru Saito

In this study, we theoretically investigated a generalized stochastic Loewner evolution (SLE) driven by reversible Langevin dynamics in the context of non-equilibrium statistical mechanics. Exploiting the ability of the Loewner evolution to encode non-equilibrium systems into equilibrium ones, we formulated the encoding mechanism of the SLE using a Gibbs-entropy-based information-theoretic approach and discussed its advantages for describing non-equilibrium systems. After deriving the entropy production and flux for the 2D trajectories of the generalized SLE curves, we reformulated the system’s entropic properties in terms of the Kullback–Leibler (KL) divergence. We demonstrate that this operation leads to alternative expressions of the Jarzynski equality and the second law of thermodynamics, consistent with the previously suggested theory of information thermodynamics. The irreversibility of the 2D trajectories is likewise discussed by decomposing the entropy into additive and non-additive parts. We numerically verified the non-equilibrium character of our model by simulating the long-time behavior of the entropic measure suggested by our formulation, referred to as the relative Loewner entropy.
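The relative Loewner entropy itself is defined in the paper; as a generic illustration of the KL-divergence-based irreversibility analysis the abstract describes, the sketch below compares forward and time-reversed increment statistics of a 1D driving function. The function names and the drifted-random-walk driving signal are illustrative assumptions, not the authors' model:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Discrete Kullback-Leibler divergence D(p || q) in nats."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def increment_histogram(trajectory, bins, range_):
    """Empirical distribution of the 1D driving-function increments."""
    inc = np.diff(trajectory)
    hist, _ = np.histogram(inc, bins=bins, range=range_)
    return hist / hist.sum()

# KL divergence between forward and time-reversed increment statistics
# is a standard proxy for trajectory irreversibility (entropy production).
rng = np.random.default_rng(0)
drive = np.cumsum(rng.normal(0.1, 1.0, 10_000))   # drifted random walk
p_fwd = increment_histogram(drive, 41, (-5, 5))
p_rev = increment_histogram(drive[::-1], 41, (-5, 5))
irrev = kl_divergence(p_fwd, p_rev)               # > 0 for a drifted walk
```

For a drift-free walk the forward and reversed increment distributions coincide and the measure vanishes, which is the qualitative signature the abstract's long-time simulations probe.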


2021 ◽  
pp. 1-54
Author(s):  
Richard Sproat ◽  
Alexander Gutkin

Abstract Taxonomies of writing systems since Gelb (1952) have classified systems based on what the written symbols represent: if they represent words or morphemes, they are logographic; if syllables, syllabic; if segments, alphabetic; etc. Sproat (2000) and Rogers (2005) broke with tradition by splitting the logographic and phonographic aspects into two dimensions, with logography treated as graded rather than categorical. A system could be syllabic and highly logographic, or alphabetic and mostly non-logographic. This accords better with how writing systems actually work, but neither author proposed a method for measuring logography. In this article we propose a novel measure of the degree of logography that uses an attention-based sequence-to-sequence model trained to predict the spelling of a token from its pronunciation in context. In an ideal phonographic system, the model should need to attend only to the current token in order to compute how to spell it, and this would show in the attention matrix activations. In contrast, with a logographic system, where a given pronunciation might correspond to several different spellings, the model would need to attend to a broader context. The ratio of the activation outside the token to the total activation forms the basis of our measure. We compare this with a simple lexical measure and an entropic measure, as well as with several other neural models, and argue that on balance our attention-based measure accords best with intuition about how logographic various systems are. Our work provides the first quantifiable measure of the notion of logography that accords with linguistic intuition and, we argue, provides better insight into what this notion means.
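The core quantity the abstract describes, the ratio of attention mass falling outside the current token to the total attention mass, can be sketched directly. The function and the toy attention matrix below are illustrative assumptions, not the authors' model or data:

```python
import numpy as np

def logography_score(attention, token_span):
    """Fraction of attention mass falling outside the target token.

    attention  : (output_steps, input_steps) matrix of non-negative
                 weights from a seq2seq model spelling one token.
    token_span : (start, end) input positions of the token itself.
    Returns a value in [0, 1]: near 0 suggests phonographic spelling
    (only the token itself matters), near 1 suggests logographic
    spelling (broad context is consulted).
    """
    att = np.asarray(attention, dtype=float)
    total = att.sum()
    start, end = token_span
    inside = att[:, start:end].sum()
    return float((total - inside) / total)

# Toy example: 3 output (grapheme) steps over 6 input positions,
# with the target token occupying input positions 2..3.
att = np.array([
    [0.0, 0.1, 0.8, 0.1, 0.0, 0.0],
    [0.0, 0.0, 0.2, 0.7, 0.1, 0.0],
    [0.3, 0.1, 0.1, 0.4, 0.1, 0.0],
])
score = logography_score(att, (2, 4))   # 0.7 / 3.0, mostly phonographic
```

In practice such scores would be averaged over a corpus per writing system; the lexical and entropic comparison measures from the article are not reproduced here.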


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Gianluca Teza ◽  
Michele Caraglio ◽  
Attilio L. Stella

Abstract We show how the Shannon entropy function can be used as a basis for complexity measures that weight the economic efficiency of countries and the specialization of products beyond bare diversification. This entropy function guarantees the existence of a fixed point, which is rapidly reached by an iterative scheme converging to our self-consistent measures. If products are partitioned into larger categories, our approach naturally allows the country competitiveness measure to be decomposed into inter-sectoral and intra-sectoral contributions. Besides outlining the technical features and advantages of the method, we describe a wide range of results arising from the analysis of the obtained rankings and benchmark these observations against those established with other economic parameters. These comparisons allow us to partition countries and products into several main typologies with well-defined characterizing features. Our methods are widely applicable to general ranking problems in bipartite networks.
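The abstract does not reproduce the paper's self-consistent equations, so the sketch below only shows the general shape of such a scheme: coupled country/product scores iterated on a bipartite export matrix, renormalized each step, converging to a fixed point. The simple averaging rule and all names are assumptions for illustration, not the authors' entropy-weighted measure:

```python
import numpy as np

def self_consistent_ranking(M, n_iter=200, tol=1e-10):
    """Iterate coupled country/product scores to a fixed point.

    M : binary country-x-product export matrix. Here a country's
    fitness is the M-weighted sum of product scores and vice versa;
    each vector is renormalized to mean 1 every step, so the
    iteration converges to the leading-eigenvector fixed point of
    the bipartite network (a stand-in for the paper's scheme).
    """
    M = np.asarray(M, dtype=float)
    n_countries, n_products = M.shape
    f = np.ones(n_countries)   # country fitness
    q = np.ones(n_products)    # product score
    for _ in range(n_iter):
        f_new = M @ q
        q_new = M.T @ f
        f_new /= f_new.mean()
        q_new /= q_new.mean()
        done = (np.abs(f_new - f).max() < tol
                and np.abs(q_new - q).max() < tol)
        f, q = f_new, q_new
        if done:
            break
    return f, q

# Nested ("triangular") toy matrix: country 0 exports everything,
# country 2 exports only the most ubiquitous product.
M = [[1, 1, 1],
     [1, 1, 0],
     [1, 0, 0]]
f, q = self_consistent_ranking(M)   # f[0] > f[1] > f[2]
```

The existence and rapid approach of the fixed point under normalization is what the abstract attributes to the entropy-based formulation; this sketch only demonstrates the iterative template.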


2021 ◽  
Author(s):  
Rosario Lo Franco ◽  
Farzam Nosrati ◽  
Alessia Castellini ◽  
Giuseppe Compagno
Keyword(s):  

Entropy ◽  
2021 ◽  
Vol 23 (2) ◽  
pp. 248
Author(s):  
Luiz Célio S. Rocha ◽  
Mariana S. Rocha ◽  
Paulo Rotella Junior ◽  
Giancarlo Aquila ◽  
Rogério S. Peruchi ◽  
...  

The high proportion of CO2/CH4 in low-value natural gas compositions can be used strategically and intelligently to produce more hydrocarbons through oxidative coupling of methane (OCM). The main goal of this study was to optimize direct low-value natural gas conversion via CO2-OCM on metal oxide catalysts using robust multi-objective optimization, with an entropic measure used to choose the most preferred Pareto optimal point as the problem’s final solution. The responses of CH4 conversion, C2 selectivity, and C2 yield are modeled using response surface methodology, with the CO2/CH4 ratio, the reactor temperature, and the wt.% CaO and wt.% MnO in the ceria catalyst as decision variables. The Pareto optimal solution was obtained at the following combination of process parameters: CO2/CH4 ratio = 2.50, reactor temperature = 1179.5 K, wt.% CaO in ceria catalyst = 17.2%, wt.% MnO in ceria catalyst = 6.0%. Using the optimal weighting strategy w1 = 0.2602, w2 = 0.3203, w3 = 0.4295, the simultaneous optimal values of the objective functions were: CH4 conversion = 8.806%, C2 selectivity = 51.468%, C2 yield = 3.275%. Finally, the entropic measure used as a decision-making criterion proved useful in mapping the regions of minimal variation among the Pareto optimal responses, demonstrating that the optimization weights influence the forecast variation of the obtained response.
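As a rough illustration of how a weighting strategy selects a preferred Pareto point, the sketch below scalarizes max-normalized objectives with the reported weights and computes the Shannon entropy of the weight vector, a common entropy-based ingredient in such decision criteria. The scalarization rule and the candidate points are simple stand-ins, not the paper's full robust procedure or data:

```python
import numpy as np

def shannon_entropy(w):
    """Shannon entropy (nats) of a normalized weight vector."""
    w = np.asarray(w, dtype=float)
    w = w / w.sum()
    return float(-np.sum(w * np.log(w)))

def preferred_pareto_point(points, weights):
    """Pick the Pareto point maximizing the weighted sum of
    max-normalized objectives (a simple scalarization stand-in)."""
    P = np.asarray(points, dtype=float)
    scaled = P / P.max(axis=0)          # each objective scaled to [0, 1]
    scores = scaled @ np.asarray(weights, dtype=float)
    return int(np.argmax(scores))

# Weights from the study's reported optimum (w1, w2, w3):
w = [0.2602, 0.3203, 0.4295]
h = shannon_entropy(w)                  # dispersion of the weighting

# Hypothetical candidate points (conversion, selectivity, yield scale):
pts = [[1.0, 0.0, 0.0],
       [0.0, 1.0, 0.0],
       [0.9, 0.9, 0.9]]
best = preferred_pareto_point(pts, w)   # the balanced point wins
```

Higher weight entropy means the objectives are traded off more evenly; a near-zero entropy would collapse the problem to a single objective.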


2020 ◽  
Vol 48 (1) ◽  
pp. 97-129
Author(s):  
Renjie Wang ◽  
Cody Hyndman ◽  
Anastasis Kratsios
Keyword(s):  

Entropy ◽  
2019 ◽  
Vol 21 (3) ◽  
pp. 222
Author(s):  
Leonid Martyushev ◽  
Evgenii Shaiapin

A hypothesis proposed in an earlier paper (Martyushev, L.M. Entropy 2017, 19, 345) on the deductive formulation of a physical theory based on explicitly and universally introduced basic concepts is further developed. An entropic measure of time with a number of properties leading to an analog of the Galileo–Einstein relativity principle is considered. Using this measure and a simple model, a kinematic law relating time to the size and number of particles of a system is obtained. Corollaries of this law are examined. In particular, accelerated growth of the system size is obtained, whereas in systems of constant size a decrease in the number of particles is observed. An interesting corollary is the emergence of repulsive and attractive forces inversely proportional to the square of the system size for relatively dense systems, and constant for systems with sufficiently low density.


Author(s):  
Leonid Martyushev ◽  
Evgenii Shaiapin

An idea expressed in an earlier paper [Entropy 2017, 19, 345] about the deductive formulation of a physical theory resting on explicitly and universally introduced basic concepts is developed. An entropic measure of time with a number of properties leading to an analog of the Galilei–Einstein relativity principle is considered. Using the introduced measure and a simple model, a kinematic law relating the size, time, and number of particles of a system is obtained. Corollaries of this law are examined. In particular, an accelerated increase of the system size is found and, if the system size remains unchanged, a decrease in the number of particles. An interesting corollary is the emergence of repulsive and attractive forces inversely proportional to the square of the system size for relatively dense systems, and constant for sufficiently rarefied systems.


Entropy ◽  
2018 ◽  
Vol 20 (9) ◽  
pp. 662 ◽  
Author(s):  
Edgar Parker

After the 2008 financial collapse, the now-popular measure of implied systemic risk called the absorption ratio was introduced. This statistic measures how closely the economy’s markets are coupled; the more closely financial markets are coupled, the more susceptible they are to systemic collapse. A newer alternative measure of financial market health, the implied information processing ratio or entropic efficiency of the economy, was derived using concepts from information theory. This entropic measure can also be useful in predicting economic downturns and measuring systematic risk. In the current work, the relationship between these two ratios and the corresponding types of risk is explored. Potential methods for jointly using these different measures to optimally reduce systemic and systematic risk are introduced.
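The absorption ratio has a standard definition (Kritzman et al.): the fraction of total return variance captured by the first few eigenvectors of the asset-return covariance matrix. A minimal sketch, with synthetic data standing in for market returns (the entropic efficiency measure from the paper is not reproduced here):

```python
import numpy as np

def absorption_ratio(returns, n_vectors):
    """Fraction of total return variance captured by the first
    n_vectors eigenvectors of the covariance matrix.
    Higher values indicate more tightly coupled markets."""
    R = np.asarray(returns, dtype=float)
    cov = np.cov(R, rowvar=False)
    eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]  # descending
    return float(eigvals[:n_vectors].sum() / eigvals.sum())

# Synthetic illustration: 10 markets over 500 periods.
rng = np.random.default_rng(1)
common = rng.normal(size=(500, 1))                    # shared factor
coupled = common @ np.ones((1, 10)) + 0.1 * rng.normal(size=(500, 10))
loose = rng.normal(size=(500, 10))                    # independent markets
ar_coupled = absorption_ratio(coupled, 1)             # close to 1
ar_loose = absorption_ratio(loose, 1)                 # much lower
```

A rising absorption ratio is read as markets becoming dominated by a few common factors, the coupling that the abstract links to susceptibility to systemic collapse.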


2017 ◽  
Vol 530 (7) ◽  
pp. 1700188 ◽  
Author(s):  
Rafael Augusto Couceiro Correa ◽  
Davi Monteiro Dantas ◽  
Pedro Henrique Ribeiro da Silva Moraes ◽  
Alvaro de Souza Dutra ◽  
Carlos Alberto Santos de Almeida
