Generalised Measures of Multivariate Information Content

Entropy ◽  
2020 ◽  
Vol 22 (2) ◽  
pp. 216 ◽  
Author(s):  
Conor Finn ◽  
Joseph Lizier

The entropy of a pair of random variables is commonly depicted using a Venn diagram. This representation is potentially misleading, however, since the multivariate mutual information can be negative. This paper presents new measures of multivariate information content that can be accurately depicted using Venn diagrams for any number of random variables. These measures complement the existing measures of multivariate mutual information and are constructed by considering the algebraic structure of information sharing. It is shown that the distinct ways in which a set of marginal observers can share their information with a non-observing third party correspond to the elements of a free distributive lattice. The redundancy lattice from partial information decomposition is then derived independently by combining the algebraic structures of joint and shared information content.
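A concrete instance of this negativity (an illustration, not an example from the paper) is the XOR gate: with X and Y independent fair bits and Z = X XOR Y, the interaction information I(X;Y;Z) obtained by inclusion-exclusion over the Venn regions is -1 bit. A minimal Python sketch:

```python
import numpy as np
from collections import Counter

def entropy_bits(samples):
    """Shannon entropy (bits) of an empirical sample of hashable items."""
    counts = np.array(list(Counter(samples).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

# Z = X XOR Y with X, Y independent fair bits: four equiprobable triples.
triples = [(x, y, x ^ y) for x in (0, 1) for y in (0, 1)]
H = lambda idx: entropy_bits([tuple(t[i] for i in idx) for t in triples])

# Inclusion-exclusion over the three-set Venn diagram:
# I(X;Y;Z) = H(X)+H(Y)+H(Z) - H(X,Y) - H(X,Z) - H(Y,Z) + H(X,Y,Z)
i_xyz = (H([0]) + H([1]) + H([2])
         - H([0, 1]) - H([0, 2]) - H([1, 2])
         + H([0, 1, 2]))
print(i_xyz)  # -1.0: the central Venn region is negative for XOR
```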

2020 ◽  
Vol 24 (6) ◽  
pp. 3097-3109
Author(s):  
Aronne Dell'Oca ◽  
Alberto Guadagnini ◽  
Monica Riva

Abstract. We employ elements of information theory to quantify (i) the information content of data collected at given measurement scales within the same porous medium domain and (ii) the relationships among the information contents of datasets associated with differing scales. We focus on gas permeability data collected over Berea Sandstone and Topopah Spring Tuff blocks, considering four measurement scales. We quantify the way information is shared across these scales through (i) the Shannon entropy of the data associated with each support scale, (ii) the mutual information shared between data taken at increasing support scales, and (iii) the multivariate mutual information shared within triplets of datasets, each associated with a given scale. We also assess the levels of uniqueness, redundancy, and synergy (i.e., the partitioning of information) that the data associated with the intermediate and largest scales provide with respect to the information embedded in the data collected at the smallest support scale of a triplet.
Highlights:
- Information theory allows characterization of the information content of permeability data related to differing measurement scales.
- An increase in the measurement scale is associated with a quantifiable loss of information about permeability.
- Redundant, unique, and synergetic contributions of information are evaluated for triplets of permeability datasets, each taken at a given scale.
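The first two quantities can be illustrated with a short sketch on synthetic data (the variable names and the upscaling model below are illustrative assumptions, not the paper's measurements or workflow):

```python
import numpy as np

def binned_entropy(x, bins=16):
    """Shannon entropy (bits) of a sample after histogram binning."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

def binned_mutual_information(x, y, bins=16):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from a 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    h = lambda q: -np.sum(q[q > 0] * np.log2(q[q > 0]))
    return h(px) + h(py) - h(pxy)

# Synthetic stand-in for log-permeability at two support scales: the larger
# scale is modeled as a local average (upscaling) of the smaller plus noise.
rng = np.random.default_rng(0)
k_small = rng.normal(size=4096)
k_large = 0.5 * (k_small + np.roll(k_small, 1)) + 0.1 * rng.normal(size=4096)

print(binned_entropy(k_small), binned_entropy(k_large))  # entropy per scale
print(binned_mutual_information(k_small, k_large))       # information shared across scales
```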


Author(s):  
A. J. Gutknecht ◽  
M. Wibral ◽  
A. Makkeh

Partial information decomposition (PID) seeks to decompose the multivariate mutual information that a set of source variables contains about a target variable into basic pieces, the so-called ‘atoms of information’. Each atom describes a distinct way in which the sources may contain information about the target. For instance, some information may be contained uniquely in a particular source, some information may be shared by multiple sources and some information may only become accessible synergistically if multiple sources are combined. In this paper, we show that the entire theory of PID can be derived, firstly, from considerations of part-whole relationships between information atoms and mutual information terms, and secondly, based on a hierarchy of logical constraints describing how a given information atom can be accessed. In this way, the idea of a PID is developed on the basis of two of the most elementary relationships in nature: the part-whole relationship and the relation of logical implication. This unifying perspective provides insights into pressing questions in the field such as the possibility of constructing a PID based on concepts other than redundant information in the general n-sources case. Additionally, it admits of a particularly accessible exposition of PID theory.
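For two sources, the part-whole relationships reduce to four atoms linked to mutual information terms by I(S1,S2;T) = U1 + U2 + R + C and I(Si;T) = Ui + R, so fixing any one atom (conventionally the redundancy R) determines the rest. A minimal sketch using the Williams-Beer I_min redundancy measure, which is one choice among many and not a measure advocated in this paper:

```python
import numpy as np
from itertools import product

# Joint distribution p(s1, s2, t) for T = S1 XOR S2 with uniform inputs.
p = {(s1, s2, s1 ^ s2): 0.25 for s1, s2 in product((0, 1), repeat=2)}

def marginal(dist, idx):
    """Marginal distribution over the coordinates listed in idx."""
    out = {}
    for k, v in dist.items():
        key = tuple(k[i] for i in idx)
        out[key] = out.get(key, 0.0) + v
    return out

def mi(dist, idx_a, idx_b):
    """Mutual information I(A;B) in bits from the joint dict."""
    pab = marginal(dist, idx_a + idx_b)
    pa, pb = marginal(dist, idx_a), marginal(dist, idx_b)
    return sum(v * np.log2(v / (pa[k[:len(idx_a)]] * pb[k[len(idx_a):]]))
               for k, v in pab.items())

def i_min(dist, source_sets, t_idx=(2,)):
    """Williams-Beer redundancy: expectation over t of the minimum
    specific information any single source set carries about T = t."""
    pt = marginal(dist, t_idx)
    total = 0.0
    for t, p_t in pt.items():
        specs = []
        for idx in source_sets:
            pst = marginal(dist, idx + t_idx)
            ps = marginal(dist, idx)
            # I(T=t; S) = sum_s p(s|t) log2( p(s,t) / (p(s) p(t)) )
            spec = sum((v / p_t) * np.log2(v / (ps[k[:len(idx)]] * p_t))
                       for k, v in pst.items() if k[len(idx):] == t)
            specs.append(spec)
        total += p_t * min(specs)
    return total

red = i_min(p, [(0,), (1,)])
u1 = mi(p, (0,), (2,)) - red               # unique to S1
u2 = mi(p, (1,), (2,)) - red               # unique to S2
syn = mi(p, (0, 1), (2,)) - u1 - u2 - red  # synergy
print(red, u1, u2, syn)  # XOR: 0.0 0.0 0.0 1.0 -- purely synergistic
```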


Author(s):  
Greg Ver Steeg

Learning by children and animals occurs effortlessly and largely without obvious supervision. Successes in automating supervised learning have not translated to the more ambiguous realm of unsupervised learning, where goals and labels are not provided. Barlow (1961) suggested that the signal that brains leverage for unsupervised learning is dependence, or redundancy, in the sensory environment. Dependence can be characterized using the information-theoretic multivariate mutual information measure called total correlation. The principle of Total Correlation Explanation (CorEx) is to learn representations of data that "explain" as much dependence in the data as possible. We review some manifestations of this principle along with successes in unsupervised learning problems across diverse domains including human behavior, biology, and language.
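The total correlation mentioned here is TC(X1, ..., Xn) = sum_i H(Xi) - H(X1, ..., Xn), which is zero exactly when the variables are independent. A minimal sketch on synthetic discrete data (illustrative only; the author's CorEx method learns latent representations that maximally explain TC, which this sketch does not attempt):

```python
import numpy as np
from collections import Counter

def entropy_bits(samples):
    """Shannon entropy (bits) of an empirical sample of hashable items."""
    counts = np.array(list(Counter(samples).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def total_correlation(columns):
    """TC(X1,...,Xn) = sum_i H(Xi) - H(X1,...,Xn); zero iff independent."""
    joint = list(zip(*columns))
    return sum(entropy_bits(col) for col in columns) - entropy_bits(joint)

# Three noisy copies of one hidden bit: strongly dependent, so TC is large.
rng = np.random.default_rng(0)
z = rng.integers(0, 2, size=5000)
cols = [tuple(np.where(rng.random(5000) < 0.9, z, 1 - z)) for _ in range(3)]
print(total_correlation(cols))  # substantially above 0: the copies share information
```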


Author(s):  
Donald Quicke ◽  
Buntika A. Butcher ◽  
Rachel Kruft Welton

Abstract. This chapter focuses on sets and Venn diagrams. Venn diagrams, also known as set diagrams, are commonly used to represent the overlap between sets. R, however, has no built-in Venn diagram function, so contributed packages are used instead.
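The chapter's examples are in R; purely as a language-neutral illustration of the same task, here is a minimal Python sketch using the matplotlib-venn package (an arbitrary stand-in, not the package used in the chapter):

```python
# pip install matplotlib matplotlib-venn
import matplotlib.pyplot as plt
from matplotlib_venn import venn2

# Region sizes: (only in A, only in B, in both A and B).
venn2(subsets=(3, 2, 1), set_labels=('A', 'B'))
plt.title('Overlap between two sets')
plt.show()
```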


2020 ◽  
Vol 2020 ◽  
pp. 1-9
Author(s):  
Ying Zhou ◽  
Dazhuan Xu ◽  
Chao Shi ◽  
Weilin Tu ◽  
Junpeng Shi

In this paper, the mutual information between the received signals and the source in a coprime linear array is investigated. In Shannon's information theory, mutual information quantifies the reduction in the a priori uncertainty about the transmitted message. Analogously, the spatial information in the coprime array is the mutual information between the direction of arrival (DOA), the source amplitude, and the received signals. This information content comprises two parts: DOA information and scattering information. For a single-source scenario, we derive the theoretical expression for the DOA information and its asymptotic upper bound; the corresponding expression for the scattering information is also derived. We then discuss applications of spatial information: the optimal array configuration can be obtained by maximizing the DOA information of the coprime array, and the same measure quantifies the performance gap between the coprime array and a uniform array. In addition, the entropy error is employed to evaluate estimation performance based on spatial information. Numerical simulations of the information content confirm the theoretical analysis. These results provide guidance for the design of coprime arrays in practical environments.
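For reference, a common coprime-array construction (following Vaidyanathan and Pal) interleaves 2M sensors at spacing N with N sensors at spacing M, in units of d = lambda/2, for coprime M and N. The sketch below generates these positions for hypothetical M = 3, N = 5; the paper's specific configuration is not reproduced here:

```python
import numpy as np

def coprime_positions(M, N):
    """Sensor positions, in units of d = lambda/2, for a coprime pair:
    2M sensors at spacing N interleaved with N sensors at spacing M."""
    sub1 = N * np.arange(2 * M)  # 0, N, 2N, ..., (2M - 1) N
    sub2 = M * np.arange(N)      # 0, M, 2M, ..., (N - 1) M
    return np.unique(np.concatenate([sub1, sub2]))

pos = coprime_positions(M=3, N=5)
print(pos)       # sparse, non-uniform positions spanning a large aperture
print(len(pos))  # far fewer physical sensors than a uniform array of equal aperture
```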

