Conditional Rényi Divergence Saddlepoint and the Maximization of α-Mutual Information

Entropy ◽  
2019 ◽  
Vol 21 (10) ◽  
pp. 969 ◽  
Author(s):  
Changxiao Cai ◽  
Sergio Verdú

Rényi-type generalizations of entropy, relative entropy and mutual information have found numerous applications throughout information theory and beyond. While there is consensus that the ways A. Rényi generalized entropy and relative entropy in 1961 are the “right” ones, several candidates have been put forth as possible mutual informations of order α. In this paper we lend further evidence to the notion that a Bayesian measure of statistical distinctness introduced by R. Sibson in 1969 (closely related to Gallager’s E_0 function) is the most natural generalization, lending itself to explicit computation and maximization, as well as closed-form formulas. This paper considers general (not necessarily discrete) alphabets and extends the major analytical results on the saddle-point and saddle-level of the conditional relative entropy to the conditional Rényi divergence. Several examples illustrate the main application of these results, namely, the maximization of α-mutual information with and without constraints.
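
For readers who want to experiment numerically, the sketch below (ours, not from the paper, which treats general alphabets) evaluates Sibson's α-mutual information for a discrete input distribution and channel using the standard closed form I_α(X;Y) = α/(α−1) · log Σ_y (Σ_x P_X(x) P_{Y|X}(y|x)^α)^{1/α}; the binary symmetric channel example is purely illustrative.

```python
import numpy as np

def sibson_alpha_mi(p_x, p_y_given_x, alpha):
    """Sibson's mutual information of order alpha (in nats) for a discrete input
    distribution p_x (shape [m]) and channel p_y_given_x (shape [m, n]).

    Uses the standard closed form
        I_alpha(X;Y) = alpha/(alpha-1) * log sum_y ( sum_x P(x) W(y|x)^alpha )^(1/alpha),
    which reduces to Shannon mutual information as alpha -> 1.
    """
    if np.isclose(alpha, 1.0):
        # Shannon mutual information as the alpha -> 1 limit.
        p_xy = p_x[:, None] * p_y_given_x
        p_y = p_xy.sum(axis=0)
        with np.errstate(divide="ignore", invalid="ignore"):
            ratio = np.where(p_xy > 0, p_xy / (p_x[:, None] * p_y[None, :]), 1.0)
        return float(np.sum(p_xy * np.log(ratio)))
    inner = np.sum(p_x[:, None] * p_y_given_x ** alpha, axis=0)  # sum over x, for each y
    return float(alpha / (alpha - 1.0) * np.log(np.sum(inner ** (1.0 / alpha))))

# Binary symmetric channel with crossover 0.1 and uniform input (illustrative).
p_x = np.array([0.5, 0.5])
W = np.array([[0.9, 0.1], [0.1, 0.9]])
for a in (0.5, 1.0, 2.0):
    print(a, sibson_alpha_mi(p_x, W, a))
```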

Author(s):  
QINGHUA HU ◽  
DAREN YU

Yager's entropy was proposed to compute the information of a fuzzy indiscernibility relation. In this paper we present a novel interpretation of Yager's entropy from the point of view of the discernibility power of a relation. Then some basic definitions in Shannon's information theory are generalized based on Yager's entropy. We introduce joint entropy, conditional entropy, mutual information and relative entropy to compute the information changes for fuzzy indiscernibility relation operations. Conditional entropy and relative conditional entropy are proposed to measure the information increment, which is interpreted as the significance of an attribute in the fuzzy rough set model. As an application, we redefine the independence of an attribute set, reduct, and relative reduct in the fuzzy rough set model based on Yager's entropy. Experimental results show that the proposed approach is suitable for fuzzy and numeric data reduction.
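
The abstract does not reproduce the definitions, so the following sketch is purely illustrative: it computes one common entropy of a fuzzy indiscernibility relation from the fuzzy rough set literature (via fuzzy equivalence-class cardinalities), together with joint and conditional entropies built from the min-intersection of relations. The exact Yager-entropy formulas used by the authors may differ.

```python
import numpy as np

def fuzzy_relation_entropy(R):
    """Entropy of a fuzzy indiscernibility relation R (n x n matrix with entries
    in [0, 1]), using the fuzzy-equivalence-class cardinality form
        H(R) = -(1/n) * sum_i log2( card([x_i]_R) / n ),  card([x_i]_R) = sum_j R[i, j].
    One entropy of this family from the fuzzy rough set literature; it may differ
    from the Yager's entropy used in the paper."""
    n = R.shape[0]
    card = R.sum(axis=1)
    return float(-np.mean(np.log2(card / n)))

def fuzzy_joint_entropy(R, S):
    """Joint entropy via the min-intersection of two relations (assumed t-norm)."""
    return fuzzy_relation_entropy(np.minimum(R, S))

def fuzzy_conditional_entropy(R, S):
    """H(S | R) = H(R, S) - H(R): information added by S given R, usable as a
    significance measure for the attributes that generate S."""
    return fuzzy_joint_entropy(R, S) - fuzzy_relation_entropy(R)

# Two toy fuzzy indiscernibility relations on three objects.
R = np.array([[1.0, 0.8, 0.2], [0.8, 1.0, 0.3], [0.2, 0.3, 1.0]])
S = np.array([[1.0, 0.1, 0.1], [0.1, 1.0, 0.9], [0.1, 0.9, 1.0]])
print(fuzzy_relation_entropy(R), fuzzy_conditional_entropy(R, S))
```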


Entropy ◽  
2019 ◽  
Vol 21 (8) ◽  
pp. 720 ◽  
Author(s):  
Sergio Verdú

We give a brief survey of the literature on the empirical estimation of entropy, differential entropy, relative entropy, mutual information and related information measures. While those quantities are of central importance in information theory, universal algorithms for their estimation are increasingly important in data science, machine learning, biology, neuroscience, economics, language, and other experimental sciences.
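
As a concrete baseline among the estimators covered by such surveys, the plug-in (empirical) estimator simply substitutes empirical frequencies into the defining formulas. The sketch below is ours, limited to finite alphabets, and is known to be biased downward for small sample sizes.

```python
import numpy as np
from collections import Counter

def plugin_entropy(samples):
    """Plug-in (empirical / maximum-likelihood) estimate of H(X) in nats:
    estimate the distribution by empirical frequencies, then evaluate entropy."""
    counts = np.array(list(Counter(samples).values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log(p)))

def plugin_mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) = H(X) + H(Y) - H(X,Y) from paired samples."""
    return (plugin_entropy(xs) + plugin_entropy(ys)
            - plugin_entropy(list(zip(xs, ys))))

rng = np.random.default_rng(0)
x = rng.integers(0, 4, size=10_000)
y = (x + rng.integers(0, 2, size=10_000)) % 4  # noisy copy of x
print(plugin_entropy(x), plugin_mutual_information(x, y))
```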


2021 ◽  
Vol 2021 (7) ◽  
Author(s):  
Dipankar Barman ◽  
Subhajit Barman ◽  
Bibhas Ranjan Majhi

Abstract We investigate the effects of the field temperature T(f) on entanglement harvesting between two uniformly accelerated detectors. For their parallel motion, the thermal nature of the fields does not produce any entanglement, and therefore the outcome is the same as in the non-thermal situation. On the contrary, T(f) affects entanglement harvesting when the detectors are in anti-parallel motion, i.e., when detectors A and B are in the right and left Rindler wedges, respectively. While for T(f) = 0 entanglement harvesting is possible for all values of A’s acceleration a_A, in the presence of temperature it is possible only within a narrow range of a_A. In (1 + 1) dimensions, the range starts from a specific value and extends to infinity, and as we increase T(f), the minimum value of a_A required for entanglement harvesting increases. Moreover, above a critical value a_A = a_c, harvesting increases as we increase T(f), which is just the opposite of what happens for accelerations below it. In (1 + 3) dimensions there are several critical values when the detectors have different accelerations. Contrary to the single range in (1 + 1) dimensions, here harvesting is possible within several discrete ranges of a_A. Interestingly, for equal accelerations, one has a single critical point, with behaviour quite similar to the (1 + 1)-dimensional results. We also discuss the dependence of the mutual information between these detectors on a_A and T(f).


2021 ◽  
Vol 2021 (3) ◽  
Author(s):  
Lucas Daguerre ◽  
Raimel Medina ◽  
Mario Solís ◽  
Gonzalo Torroba

Abstract We study different aspects of quantum field theory at finite density using methods from quantum information theory. For simplicity we focus on massive Dirac fermions with nonzero chemical potential, and work in 1 + 1 space-time dimensions. Using the entanglement entropy on an interval, we construct an entropic c-function that is finite. Unlike what happens in Lorentz-invariant theories, this c-function exhibits a strong violation of monotonicity; it also encodes the creation of long-range entanglement from the Fermi surface. Motivated by previous works on lattice models, we next calculate numerically the Rényi entropies and find Friedel-type oscillations; these are understood in terms of a defect operator product expansion. Furthermore, we consider the mutual information as a measure of correlation functions between different regions. Using a long-distance expansion previously developed by Cardy, we argue that the mutual information detects Fermi surface correlations already at leading order in the expansion. We also analyze the relative entropy and its Rényi generalizations in order to distinguish states with different charge and/or mass. In particular, we show that states in different superselection sectors give rise to a super-extensive behavior in the relative entropy. Finally, we discuss possible extensions to interacting theories, and argue for the relevance of some of these measures for probing non-Fermi liquids.
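
For orientation (our paraphrase, not quoted from the paper, whose finite-density construction may differ in detail): the entropic c-function referred to here is usually built from the entanglement entropy S(ℓ) of an interval of length ℓ in the Casini–Huerta form c(ℓ) = ℓ · dS(ℓ)/dℓ, which the entropic c-theorem shows to be non-increasing in ℓ for Lorentz-invariant theories; the strong violation of monotonicity mentioned above is relative to this behaviour.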


Mathematics ◽  
2021 ◽  
Vol 9 (12) ◽  
pp. 1423
Author(s):  
Javier Bonilla ◽  
Daniel Vélez ◽  
Javier Montero ◽  
J. Tinguaro Rodríguez

In the last two decades, information entropy measures have been widely applied in fuzzy clustering problems to regularize solutions by avoiding the formation of partitions with excessively overlapping clusters. Following this idea, relative entropy or divergence measures have been similarly applied, in particular so that this kind of entropy-based regularization can also take into account, and interact with, cluster size variables. Since Rényi divergence generalizes several other divergence measures, its application in fuzzy clustering seems promising for devising more general and potentially more effective methods. However, previous works making use of Rényi entropy or Rényi divergence in fuzzy clustering have, respectively, either not considered cluster sizes (thus applying regularization in terms of entropy rather than divergence) or employed divergence without a regularization purpose. The main contribution of this work is therefore the introduction of a new regularization term based on the Rényi relative entropy between membership degrees and observation ratios per cluster, which penalizes overlapping solutions in fuzzy clustering analysis. Specifically, this Rényi divergence-based term is added to the variance-based fuzzy C-means objective function when cluster sizes are allowed to vary. This leads to the development of two new fuzzy clustering methods exhibiting Rényi divergence-based regularization, the second extending the first by using a Gaussian kernel metric instead of the Euclidean distance. Iterative expressions for these methods are derived through the explicit application of Lagrange multipliers. An interesting feature of these expressions is that the proposed methods seem to take advantage of a greater amount of information in the updating steps for membership degrees and observation ratios per cluster. Finally, an extensive computational study is presented, showing the feasibility and comparatively good performance of the proposed methods.
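
The paper derives its own objective function and update rules; purely as an illustration of how a Rényi divergence-based term between membership degrees and observation ratios per cluster can be combined with the variance-based fuzzy C-means cost, here is a sketch in which the fuzzifier m, the weight lam, and the order alpha are placeholder choices of ours, not the paper's notation.

```python
import numpy as np

def renyi_divergence(p, q, alpha, eps=1e-12):
    """Renyi divergence of order alpha between discrete distributions p and q (nats)."""
    p, q = np.clip(p, eps, None), np.clip(q, eps, None)
    if np.isclose(alpha, 1.0):
        return float(np.sum(p * np.log(p / q)))          # KL divergence as the alpha -> 1 limit
    return float(np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0))

def regularized_fcm_objective(X, V, U, m=2.0, lam=1.0, alpha=0.5):
    """Variance-based fuzzy C-means cost plus a Renyi-divergence penalty between each
    observation's membership vector U[i] and the cluster-size ratios w.
    Illustrative only: the paper's exact objective and update rules are derived there."""
    d2 = ((X[:, None, :] - V[None, :, :]) ** 2).sum(axis=2)   # squared distances, shape (n, c)
    fcm_cost = np.sum((U ** m) * d2)
    w = U.mean(axis=0)                                        # observation ratios per cluster
    penalty = sum(renyi_divergence(U[i], w, alpha) for i in range(X.shape[0]))
    return fcm_cost + lam * penalty

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 2))                  # toy data
V = rng.normal(size=(3, 2))                   # cluster prototypes
U = rng.dirichlet(np.ones(3), size=50)        # row-stochastic membership degrees
print(regularized_fcm_objective(X, V, U))
```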


Author(s):  
M. Vidyasagar

This chapter provides an introduction to some elementary aspects of information theory, including entropy in its various forms. Entropy refers to the level of uncertainty associated with a random variable (or more precisely, the probability distribution of the random variable). When there are two or more random variables, it is worthwhile to study the conditional entropy of one random variable with respect to another. The last concept is relative entropy, also known as the Kullback–Leibler divergence, which measures the “disparity” between two probability distributions. The chapter first considers convex and concave functions before discussing the properties of the entropy function, conditional entropy, uniqueness of the entropy function, and the Kullback–Leibler divergence.
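
As a quick companion to the chapter's definitions, the sketch below evaluates the discrete forms of entropy, conditional entropy, and the Kullback–Leibler divergence; the formulas are standard, the code is ours.

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(X) = -sum p log p, in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def conditional_entropy(p_xy):
    """H(Y|X) = H(X,Y) - H(X), from a joint probability table p_xy[x, y]."""
    p_x = p_xy.sum(axis=1)
    return entropy(p_xy.ravel()) - entropy(p_x)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits; requires q > 0 wherever p > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

p_xy = np.array([[0.3, 0.2], [0.1, 0.4]])
print(entropy(p_xy.sum(axis=1)), conditional_entropy(p_xy),
      kl_divergence([0.5, 0.5], [0.9, 0.1]))
```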


This chapter presents a higher-order-logic formalization of the main concepts of information theory (Cover & Thomas, 1991), such as the Shannon entropy and mutual information, using the formalization of the foundational theories of measure, Lebesgue integration, and probability. The main results of the chapter include the formalizations of the Radon-Nikodym derivative and the Kullback-Leibler (KL) divergence (Coble, 2010). The latter provides a unified framework based on which most of the commonly used measures of information can be defined. The chapter then provides the general definitions that are valid for both discrete and continuous cases and then proves the corresponding reduced expressions where the measures considered are absolutely continuous over finite spaces.
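
For reference (our paraphrase of the standard definition underlying this unified framework): for probability measures μ absolutely continuous with respect to ν, the KL divergence is D(μ‖ν) = ∫ log(dμ/dν) dμ, where dμ/dν is the Radon-Nikodym derivative; over a finite space this reduces to the familiar Σ_i p_i log(p_i/q_i).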


Entropy ◽  
2019 ◽  
Vol 21 (8) ◽  
pp. 778 ◽  
Author(s):  
Amos Lapidoth ◽  
Christoph Pfister

Two families of dependence measures between random variables are introduced. They are based on the Rényi divergence of order α and the relative α-entropy, respectively, and both dependence measures reduce to Shannon’s mutual information when their order α is one. The first measure shares many properties with the mutual information, including the data-processing inequality, and can be related to the optimal error exponents in composite hypothesis testing. The second measure does not satisfy the data-processing inequality, but appears naturally in the context of distributed task encoding.
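
To make the first family concrete on a small example, the sketch below numerically minimizes the Rényi divergence of order α between a joint distribution and a product of two free marginals; this is our reading of the Rényi-divergence-based dependence measure (see the paper for the precise definitions and for the second, relative α-entropy-based measure). At α = 1 the minimum recovers Shannon's mutual information.

```python
import numpy as np
from scipy.optimize import minimize

def renyi_div(p, q, alpha, eps=1e-12):
    """Renyi divergence D_alpha(p || q) for discrete distributions, in nats."""
    p, q = np.clip(p, eps, None), np.clip(q, eps, None)
    if np.isclose(alpha, 1.0):
        return float(np.sum(p * np.log(p / q)))          # KL limit
    return float(np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0))

def dependence_measure(p_xy, alpha):
    """Numerically evaluate min over product distributions Q_X x Q_Y of
    D_alpha(P_XY || Q_X x Q_Y).  Brute-force sketch: parametrize the two
    marginals with softmax weights and minimize with Nelder-Mead."""
    nx, ny = p_xy.shape
    def objective(theta):
        qx = np.exp(theta[:nx]); qx /= qx.sum()
        qy = np.exp(theta[nx:]); qy /= qy.sum()
        return renyi_div(p_xy.ravel(), np.outer(qx, qy).ravel(), alpha)
    res = minimize(objective, np.zeros(nx + ny), method="Nelder-Mead",
                   options={"xatol": 1e-8, "fatol": 1e-10, "maxiter": 20_000})
    return res.fun

p_xy = np.array([[0.4, 0.1], [0.1, 0.4]])
# At alpha = 1 the measure reduces to Shannon mutual information.
print(dependence_measure(p_xy, 1.0), dependence_measure(p_xy, 0.5))
```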

