On Relations Between the Relative Entropy and χ²-Divergence, Generalizations and Applications

Entropy ◽  
2020 ◽  
Vol 22 (5) ◽  
pp. 563 ◽  
Author(s):  
Tomohiro Nishiyama ◽  
Igal Sason

The relative entropy and the chi-squared divergence are fundamental divergence measures in information theory and statistics. This paper is focused on a study of integral relations between the two divergences, the implications of these relations, their information-theoretic applications, and some generalizations pertaining to the rich class of f-divergences. Applications that are studied in this paper refer to lossless compression, the method of types and large deviations, strong data-processing inequalities, bounds on contraction coefficients and maximal correlation, and the convergence rate to stationarity of a type of discrete-time Markov chains.
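As a concrete illustration of how the two divergences interact, the following Python sketch computes both for random discrete distributions and checks the classical bound D(P‖Q) ≤ log(1 + χ²(P‖Q)), which follows from Jensen's inequality. This is a standard inequality offered here for orientation, not one of the integral identities derived in the paper.

```python
import numpy as np

def kl_divergence(p, q):
    """Relative entropy D(P||Q) in nats for discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def chi2_divergence(p, q):
    """Chi-squared divergence chi^2(P||Q) = sum (p - q)^2 / q."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum((p - q) ** 2 / q))

rng = np.random.default_rng(0)
for _ in range(5):
    p = rng.dirichlet(np.ones(4))
    q = rng.dirichlet(np.ones(4))
    d, chi2 = kl_divergence(p, q), chi2_divergence(p, q)
    # Classical bound relating the two divergences: D <= log(1 + chi^2).
    assert d <= np.log1p(chi2) + 1e-12
    print(f"D = {d:.4f}, log(1 + chi2) = {np.log1p(chi2):.4f}")
```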

Author(s):  
Cecilia Viscardi ◽  
Michele Boreale ◽  
Fabio Corradi

We consider the problem of sample degeneracy in Approximate Bayesian Computation. It arises when proposed values of the parameters, once given as input to the generative model, rarely lead to simulations resembling the observed data and are hence discarded. Such “poor” parameter proposals do not contribute at all to the representation of the parameter’s posterior distribution. This leads to a very large number of required simulations and/or a waste of computational resources, as well as to distortions in the computed posterior distribution. To mitigate this problem, we propose an algorithm, referred to as the Large Deviations Weighted Approximate Bayesian Computation algorithm, in which, via Sanov’s Theorem, strictly positive weights are computed for all proposed parameters, thus avoiding the rejection step altogether. In order to derive a computable asymptotic approximation from Sanov’s result, we adopt the information-theoretic “method of types” formulation of the method of Large Deviations, thus restricting our attention to models for i.i.d. discrete random variables. Finally, we experimentally evaluate our method through a proof-of-concept implementation.
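A minimal sketch of the weighting idea, under assumptions of my own (a Bernoulli generative model and hypothetical helper names; this is not the authors' implementation): by Sanov's theorem, the probability that n i.i.d. draws from the model at θ produce the observed empirical type p̂ decays as exp(−n D(p̂‖p_θ)), so every prior draw can receive that strictly positive weight in place of an accept/reject step.

```python
import numpy as np

def kl(p, q):
    """Relative entropy D(p||q) in nats, with the convention 0*log 0 = 0."""
    m = p > 0
    return float(np.sum(p[m] * np.log(p[m] / q[m])))

def ld_weighted_abc(observed, n_proposals=5000, seed=0):
    """Weight every prior draw by the Sanov approximation
    exp(-n * D(p_hat || p_theta)) instead of accepting/rejecting."""
    rng = np.random.default_rng(seed)
    n = observed.size
    p_hat = np.bincount(observed, minlength=2) / n   # observed empirical type
    thetas = rng.uniform(size=n_proposals)           # Uniform(0, 1) prior
    weights = np.empty(n_proposals)
    for i, t in enumerate(thetas):
        p_theta = np.array([1.0 - t, t])             # Bernoulli(theta) model
        weights[i] = np.exp(-n * kl(p_hat, p_theta))
    return thetas, weights / weights.sum()

# Toy run: 40 coin flips with 28 heads; the weighted sample should
# concentrate near theta = 0.7, mimicking the Beta(29, 13) posterior.
obs = np.array([1] * 28 + [0] * 12)
thetas, w = ld_weighted_abc(obs)
print("weighted posterior mean ~", np.sum(thetas * w))
```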


Entropy ◽  
2021 ◽  
Vol 23 (7) ◽  
pp. 858
Author(s):  
Dongshan He ◽  
Qingyu Cai

In this paper, we present a derivation of the black hole area entropy from the relationship between entropy and information. The curved space around a black hole allows objects to be imaged in the same way as a camera lens does. The maximal information that a black hole can gain is limited by both the Compton wavelength of the object and the diameter of the black hole. When an object falls into a black hole, its information disappears due to the no-hair theorem, and the entropy of the black hole increases correspondingly. The area entropy of a black hole can thus be obtained, which indicates that the Bekenstein–Hawking entropy is information entropy rather than thermodynamic entropy. Quantum corrections to black hole entropy are also obtained from the limit on the Compton wavelength of the captured particles, which makes the mass of a black hole naturally quantized. Our work provides an information-theoretic perspective for understanding the nature of black hole entropy.
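A back-of-the-envelope version of the well-known Bekenstein-style heuristic behind such derivations (my own sketch, not the paper's exact calculation): the least-informative particle a black hole of radius R can capture has Compton wavelength of order the horizon diameter, and swallowing it grows the horizon by roughly one Planck area per bit.

```latex
% Minimal capture: Compton wavelength comparable to the horizon size.
\[
  \lambda = \frac{\hbar}{mc} \sim 2R
  \;\Rightarrow\;
  \delta M \sim \frac{\hbar}{2Rc},
  \qquad
  \delta R = \frac{2G\,\delta M}{c^{2}} \sim \frac{G\hbar}{Rc^{3}}
           = \frac{\ell_p^{2}}{R}.
\]
% Each captured bit adds a fixed Planck-scale patch of horizon area:
\[
  \delta A = 8\pi R\,\delta R \sim 8\pi \ell_p^{2}
  \;\Longrightarrow\;
  S \propto \frac{A}{\ell_p^{2}},
  \quad\text{consistent with } S_{\mathrm{BH}} = \frac{A}{4\ell_p^{2}}
  \text{ up to the numerical coefficient.}
\]
```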


2017 ◽  
Vol 28 (7) ◽  
pp. 954-966 ◽  
Author(s):  
Colin Bannard ◽  
Marla Rosner ◽  
Danielle Matthews

Of all the things a person could say in a given situation, what determines what is worth saying? Greenfield’s principle of informativeness states that, right from the onset of language, humans selectively comment on whatever they find unexpected. In this article, we quantify this tendency using information-theoretic measures and report on a study in which we tested the counterintuitive prediction that children will produce words that have a low frequency given the context, because these will be most informative. Using corpora of child-directed speech, we identified adjectives that varied in how informative (i.e., unexpected) they were given the noun they modified. In an initial experiment (N = 31) and in a replication (N = 13), 3-year-olds heard an experimenter use these adjectives to describe pictures. The children’s task was then to describe the pictures to another person. As the information content of the experimenter’s adjective increased, so did the children’s tendency to comment on the feature that adjective had encoded. Furthermore, our analyses suggest that children balance informativeness with a competing drive to ease production.
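The informativeness measure at work here is corpus-based surprisal. A minimal sketch under my own assumptions (toy counts standing in for a real child-directed-speech corpus): the information content of an adjective given the noun it modifies is −log₂ P(adjective | noun), estimated from adjective–noun co-occurrence counts.

```python
import math
from collections import Counter

# Hypothetical adjective-noun co-occurrence counts standing in for a
# child-directed-speech corpus.
pairs = Counter({
    ("big", "dog"): 40, ("little", "dog"): 35,
    ("wet", "dog"): 4, ("purple", "dog"): 1,
})

def surprisal(adj, noun):
    """Information content -log2 P(adj | noun), from raw counts."""
    noun_total = sum(c for (a, n), c in pairs.items() if n == noun)
    return -math.log2(pairs[(adj, noun)] / noun_total)

for adj in ("big", "wet", "purple"):
    print(f"{adj} dog: {surprisal(adj, 'dog'):.2f} bits")
# "purple dog" carries the most information, i.e., is the most unexpected.
```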


2020 ◽  
Vol 9 (5) ◽  
Author(s):  
Anjishnu Bose ◽  
Parthiv Haldar ◽  
Aninda Sinha ◽  
Pritish Sinha ◽  
Shaswat Tiwari

We consider entanglement measures in 2-to-2 scattering in quantum field theories, focusing on relative entropy, which distinguishes two different density matrices. Relative entropy is investigated in several cases, including $\phi^4$ theory, chiral perturbation theory ($\chi$PT) describing pion scattering, and dilaton scattering in type II superstring theory. We derive a high-energy bound on the relative entropy using known bounds on the elastic differential cross-sections in massive QFTs. In $\chi$PT, relative entropy close to threshold has simple expressions in terms of ratios of scattering lengths. Definite sign properties are found for the relative entropy which are over and above the usual positivity of relative entropy in certain cases. We then turn to the recent numerical investigations of the S-matrix bootstrap in the context of pion scattering. By imposing these sign constraints and the $\rho$ resonance, we find restrictions on the allowed S-matrices. By performing hypothesis testing using relative entropy, we isolate two sets of S-matrices living on the boundary which give scattering lengths comparable to experiments but one of which is far from the 1-loop $\chi$PT Adler zeros. We perform a preliminary analysis to constrain the allowed space further, using ideas involving positivity inside the extended Mandelstam region and other quantum-information-theoretic measures based on entanglement in isospin.
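For orientation, the central quantity here is the quantum relative entropy between two density matrices; the standard definition is stated below for the reader's convenience (the abstract's "definite sign properties" sharpen the positivity shown here in specific cases).

```latex
\[
  S(\rho \,\|\, \sigma)
  \;=\;
  \operatorname{Tr}\!\bigl[\rho\,(\log\rho - \log\sigma)\bigr]
  \;\geq\; 0,
\]
% with equality if and only if rho = sigma; relative entropy thus serves
% as a distinguishability measure between the two states.
```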


2017 ◽  
Vol 114 (11) ◽  
pp. 592-622 ◽  
Author(s):  
H. K. Andersen

This paper articulates an account of causation as a collection of information-theoretic relationships between patterns instantiated in the causal nexus. I draw on Dennett’s account of real patterns to characterize potential causal relata as patterns with specific identification criteria and noise tolerance levels, and actual causal relata as those patterns instantiated at some spatiotemporal location in the rich causal nexus as originally developed by Salmon. I develop a representation framework using phase space to precisely characterize causal relata, including their degree(s) of counterfactual robustness, causal profiles, causal connectivity, and privileged grain size. By doing so, I show how the philosophical notion of causation can be rendered in a format that is amenable to the direct application of mathematical techniques from information theory, such that the resulting informational measures are causal informational measures. This account provides a metaphysics of causation that supports interventionist semantics and causal modeling and discovery techniques.


2002 ◽  
Vol 11 (1) ◽  
pp. 79-95 ◽  
Author(s):  
DUDLEY STARK ◽  
A. GANESH ◽  
NEIL O’CONNELL

We study the asymptotic behaviour of the relative entropy (to stationarity) for a commonly used model of riffle shuffling a deck of $n$ cards $m$ times. Our results establish, and were motivated by, a prediction made in a recent numerical study of Trefethen and Trefethen. Loosely speaking, the relative entropy decays approximately linearly (in $m$) for $m < \log_2 n$, and approximately exponentially for $m > \log_2 n$. The deck becomes random in this information-theoretic sense after $m = \frac{3}{2}\log_2 n$ shuffles.
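The model in question is the Gilbert–Shannon–Reeds (GSR) shuffle, for which Bayer and Diaconis gave a closed form: after m shuffles, a permutation with r rising sequences has probability C(2^m + n − r, n)/2^{mn}. The Python sketch below uses that formula, together with the Eulerian-number count of permutations by rising sequences, to tabulate the relative entropy to uniformity; it is my own illustration of the quantity studied, not the authors' code.

```python
import math

def eulerian(n):
    """Eulerian numbers A(n, k): permutations of 1..n with k descents
    (equivalently, with k + 1 rising sequences)."""
    A = [1]                                   # A(1, 0) = 1
    for m in range(2, n + 1):
        A = [(k + 1) * (A[k] if k < len(A) else 0)
             + (m - k) * (A[k - 1] if k >= 1 else 0)
             for k in range(m)]
    return A

def gsr_relative_entropy(n, m):
    """Relative entropy (bits) to the uniform distribution on permutations
    after m GSR riffle shuffles of n cards, via the Bayer-Diaconis formula
    P(pi) = C(2^m + n - r, n) / 2^(m*n), r = #rising sequences of pi."""
    A = eulerian(n)
    nf = math.factorial(n)
    d = 0.0
    for r in range(1, n + 1):
        p = math.comb(2 ** m + n - r, n) / 2 ** (m * n)
        if p > 0:
            d += A[r - 1] * p * math.log2(nf * p)
    return d

# For a 52-card deck the decay changes character near log2(52) ~ 5.7,
# matching the linear-then-exponential picture described above.
for m in range(1, 13):
    print(m, round(gsr_relative_entropy(52, m), 4))
```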


Author(s):  
M. Vidyasagar

This chapter provides an introduction to some elementary aspects of information theory, including entropy in its various forms. Entropy refers to the level of uncertainty associated with a random variable (or, more precisely, with its probability distribution). When there are two or more random variables, it is worthwhile to study the conditional entropy of one random variable with respect to another. The final concept is relative entropy, also known as the Kullback–Leibler divergence, which measures the “disparity” between two probability distributions. The chapter first considers convex and concave functions before discussing the properties of the entropy function, conditional entropy, the uniqueness of the entropy function, and the Kullback–Leibler divergence.
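A compact sketch of the three quantities the chapter builds up, using the standard definitions and a toy joint distribution of my own choosing:

```python
import numpy as np

def entropy(p):
    """H(X) = -sum p log2 p, in bits."""
    p = np.asarray(p, float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def conditional_entropy(joint):
    """H(X|Y) = H(X,Y) - H(Y) for a joint pmf with rows X, columns Y."""
    return entropy(joint.ravel()) - entropy(joint.sum(axis=0))

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P||Q) in bits."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = p > 0
    return float(np.sum(p[m] * np.log2(p[m] / q[m])))

joint = np.array([[0.30, 0.10],   # toy joint pmf of (X, Y)
                  [0.20, 0.40]])
print("H(X)   =", entropy(joint.sum(axis=1)))
print("H(X|Y) =", conditional_entropy(joint))  # conditioning cannot raise entropy
print("D      =", kl_divergence([0.5, 0.5], [0.9, 0.1]))
```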


In previous chapters, the authors provided a comprehensive framework that can be used in the formal probabilistic and information-theoretic analysis of a wide range of systems and protocols. In this chapter, they illustrate the usefulness of conducting this analysis using theorem proving by tackling a number of applications, including a data compression application, the formal analysis of an anonymity-based MIX channel, and the properties of the one-time pad encryption system.
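One such property admits a one-line information-theoretic statement. For the one-time pad with key K uniform and independent of the message M, the ciphertext C = M ⊕ K satisfies Shannon's perfect-secrecy condition; this is the classical result one would expect such a formal analysis to establish, stated here for reference rather than taken from the chapter itself.

```latex
\[
  I(M; C) \;=\; 0
  \quad\Longleftrightarrow\quad
  H(M \mid C) \;=\; H(M),
\]
% i.e., observing the ciphertext gives an adversary no information
% about the message.
```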


Entropy ◽  
2020 ◽  
Vol 22 (4) ◽  
pp. 444
Author(s):  
Stephen Fox ◽  
Adrian Kotelba

Amidst certainty, efficiency can improve sustainability by reducing resource consumption. However, flexibility is needed in order to survive when uncertainty increases. Accordingly, sustainable production cannot persist in the long term without both flexibility and efficiency. Referring to cognitive science to inform the development of production systems is well established. However, recent cognitive-science research encompassing flexibility and efficiency in brain functioning has not been considered previously; in particular, research by others encompassing information (I), information entropy (H), relative entropy (D), transfer entropy (TE), and brain entropy. By contrast, in this paper, flexibility and efficiency for persistent sustainable production are analyzed in relation to these information-theoretic applications in cognitive science and are quantified in terms of information. Thus, this paper is consistent with the established practice of referring to cognitive science to inform the development of production systems, yet it is novel in addressing the need to combine flexibility and efficiency for persistent sustainability in terms of cognitive functioning as modelled with information theory.

