Mutual Information between Discrete Variables with Many Categories using Recursive Adaptive Partitioning

2015 ◽  
Vol 5 (1) ◽  
Author(s):  
Junhee Seok ◽  
Yeong Seon Kang


2011 ◽
Vol 121-126 ◽  
pp. 4203-4207 ◽  
Author(s):  
Lin Huo ◽  
Chuan Lv ◽  
Si Miao Fei ◽  
Dong Zhou

Most mutual information methods to date are limited to correlation analysis between discrete variables and tend to favor characteristic variables with many distinct values. In this paper, we propose a new approach based on mutual information to measure the correlation between discrete variables and continuous variables. We then take the fire control system of an aircraft as an example, calculate the correlation between fault types and monitored data indexes, and finally identify the fault symptom classes.
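
The listing gives no estimator code or formulas; as a minimal sketch of the general technique, the Python snippet below (hypothetical names `mi_discrete_continuous`, `fault`, `index`; simple equal-width binning standing in for whatever discretization the authors actually use) estimates mutual information between a discrete fault class and a continuous monitor index:

```python
import numpy as np

def mi_discrete_continuous(labels, values, n_bins=10):
    """Plug-in MI estimate (nats): discretize the continuous variable
    into equal-width bins, then sum p(x,y) * log(p(x,y) / (p(x) p(y)))."""
    x = np.asarray(labels)
    edges = np.histogram_bin_edges(values, bins=n_bins)
    y = np.digitize(values, edges[1:-1])   # bin indices 0 .. n_bins-1
    mi = 0.0
    for xv in np.unique(x):
        for yv in np.unique(y):
            p_xy = np.mean((x == xv) & (y == yv))
            if p_xy > 0:
                mi += p_xy * np.log(p_xy / (np.mean(x == xv) * np.mean(y == yv)))
    return mi

# Hypothetical example: fault class vs. one continuous monitor index.
rng = np.random.default_rng(0)
fault = rng.integers(0, 3, size=1000)            # three hypothetical fault types
index = fault + rng.normal(0.0, 0.5, size=1000)  # a correlated sensor reading
print(mi_discrete_continuous(fault, index))
```

A larger value indicates that the monitor index is more informative about the fault type, which is the kind of ranking such a correlation analysis produces.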


2016 ◽  
Vol 32 (14) ◽  
pp. 2233-2235 ◽  
Author(s):  
Alexander Lachmann ◽  
Federico M. Giorgi ◽  
Gonzalo Lopez ◽  
Andrea Califano

Entropy ◽  
2019 ◽  
Vol 21 (3) ◽  
pp. 243
Author(s):  
Wentao Huang ◽  
Kechen Zhang

Although Shannon mutual information has been widely used, its effective calculation is often difficult for many practical problems, including those in neural population coding. Asymptotic formulas based on Fisher information sometimes provide accurate approximations to the mutual information but this approach is restricted to continuous variables because the calculation of Fisher information requires derivatives with respect to the encoded variables. In this paper, we consider information-theoretic bounds and approximations of the mutual information based on Kullback-Leibler divergence and Rényi divergence. We propose several information metrics to approximate Shannon mutual information in the context of neural population coding. While our asymptotic formulas all work for discrete variables, one of them has consistent performance and high accuracy regardless of whether the encoded variables are discrete or continuous. We performed numerical simulations and confirmed that our approximation formulas were highly accurate for approximating the mutual information between the stimuli and the responses of a large neural population. These approximation formulas may potentially bring convenience to the applications of information theory to many practical and theoretical problems.
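
For context, the Fisher-information approach the abstract contrasts against is usually written as follows (in the form of Brunel and Nadal's classic asymptotic result, for instance; not a formula quoted from this paper). For a scalar stimulus $x$ with prior $p(x)$, response $r$, and Fisher information $J(x)$:

$$ I(X;R) \;\approx\; H(X) + \frac{1}{2}\int p(x)\,\log\frac{J(x)}{2\pi e}\,dx, \qquad J(x) = \mathbb{E}\!\left[\left(\frac{\partial}{\partial x}\log p(r\mid x)\right)^{2}\,\middle|\;x\right]. $$

The derivative inside $J(x)$ is exactly what fails to exist when the encoded variable is discrete, which motivates the divergence-based bounds proposed here.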


Author(s):  
Wentao Huang ◽  
Kechen Zhang

Information theory is widely used in various disciplines, and effective calculation of Shannon mutual information is typically not an easy task for many practical applications, including problems of neural population coding in computational and theoretical neuroscience. Asymptotic formulas based on Fisher information may provide accurate approximations to mutual information but this approach is restricted to continuous variables because the calculation requires derivatives with respect to the encoded variables. In this paper, we consider information-theoretic bounds and approximations based on Kullback-Leibler divergence and Rényi divergence. We propose several information metrics to approximate Shannon mutual information in the context of neural population coding, and these asymptotic formulas hold true for discrete variables as there is no requirement for differentiability. In particular, one of our approximation formulas has consistent performance and good accuracy regardless of whether the encoded variables are discrete or continuous. We performed numerical simulations and confirmed that our approximation formulas were highly accurate for approximating mutual information between the discrete variables or stimuli and the responses of a large neural population. These approximation formulas may potentially bring convenience to the applications of information theory to many practical and theoretical problems.
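
For reference, the Rényi divergence of order $\alpha$ mentioned above is (standard definition, not specific to this paper):

$$ D_{\alpha}(P\,\|\,Q) = \frac{1}{\alpha-1}\,\log\sum_{x} p(x)^{\alpha}\,q(x)^{1-\alpha}, \qquad \alpha > 0,\ \alpha \neq 1, $$

with the sum replaced by an integral for continuous variables; it recovers the Kullback-Leibler divergence in the limit $\alpha \to 1$. Because neither quantity involves derivatives with respect to the encoded variable, approximations built from them apply to discrete variables directly.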


Entropy ◽  
2018 ◽  
Vol 20 (4) ◽  
pp. 297 ◽  
Author(s):  
Conor Finn ◽  
Joseph Lizier

What are the distinct ways in which a set of predictor variables can provide information about a target variable? When does a variable provide unique information, when do variables share redundant information, and when do variables combine synergistically to provide complementary information? The redundancy lattice from the partial information decomposition of Williams and Beer provided a promising glimpse at the answer to these questions. However, this structure was constructed using a much-criticised measure of redundant information, and despite sustained research, no completely satisfactory replacement measure has been proposed. In this paper, we take a different approach, applying the axiomatic derivation of the redundancy lattice to a single realisation from a set of discrete variables. To overcome the difficulty associated with signed pointwise mutual information, we apply this decomposition separately to the unsigned entropic components of pointwise mutual information, which we refer to as the specificity and ambiguity. This yields a separate redundancy lattice for each component. Then, based upon an operational interpretation of redundancy, we define measures of redundant specificity and ambiguity, enabling us to evaluate the partial information atoms in each lattice. These atoms can be recombined to yield the sought-after multivariate information decomposition. We apply this framework to canonical examples from the literature and discuss the results and the various properties of the decomposition. In particular, the pointwise decomposition using specificity and ambiguity satisfies a chain rule over target variables, which provides new insights into the so-called two-bit-copy example.
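
In the abstract's terms, the underlying split is the standard decomposition of pointwise mutual information into two non-negative entropic parts (the notation below is conventional, not quoted from the paper):

$$ i(s;t) = \log\frac{p(s\mid t)}{p(s)} = h(s) - h(s\mid t), \qquad h(s) = -\log p(s), \quad h(s\mid t) = -\log p(s\mid t), $$

where $h(s)$ is the specificity and $h(s\mid t)$ is the ambiguity. Since each part is unsigned, the redundancy-lattice construction can be applied to each separately, avoiding the sign problems of raw pointwise mutual information.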


Author(s):  
Antara Dasgupta ◽  
Renaud Hostache ◽  
RAAJ Ramasankaran ◽  
Guy J.‐P Schumann ◽  
Stefania Grimaldi ◽  
...  

1997 ◽  
Vol 36 (04/05) ◽  
pp. 257-260 ◽  
Author(s):  
H. Saitoh ◽  
T. Yokoshima ◽  
H. Kishida ◽  
H. Hayakawa ◽  
R. J. Cohen ◽  
...  

Abstract: The frequency of ventricular premature beats (VPBs) has been related to the risk of mortality. However, little is known about the temporal pattern of occurrence of VPBs and its relationship to autonomic activity. Hence, we applied a general correlation measure, mutual information, to quantify how VPBs are generated over time. We also used mutual information to determine the correlation between VPB production and heart rate in order to evaluate the effects of autonomic activity on VPB production. We examined twenty subjects with more than 3000 VPBs/day and simulated random time series of VPB occurrence. We found that mutual information values could be used to quantitatively characterize the temporal patterns of VPB generation. Our data suggest that VPB production is not random and that VPBs generated with a higher value of mutual information may be more strongly affected by autonomic activity.
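
The abstract does not give the exact estimator; the sketch below (hypothetical window length, discretization levels, and variable names, with a plug-in MI estimate) shows how one might quantify the correlation between per-window VPB counts and heart rate:

```python
import numpy as np

def plugin_mi(x, y):
    """Plug-in mutual information (nats) between two discrete series."""
    xs, x_idx = np.unique(x, return_inverse=True)
    ys, y_idx = np.unique(y, return_inverse=True)
    joint = np.zeros((len(xs), len(ys)))
    np.add.at(joint, (x_idx, y_idx), 1)        # joint count table
    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    nz = p_xy > 0
    return float((p_xy[nz] * np.log(p_xy[nz] / (p_x * p_y)[nz])).sum())

# Hypothetical data: per-minute VPB counts and mean heart rate for one day,
# each discretized into a few levels before computing MI.
rng = np.random.default_rng(1)
hr = rng.normal(70, 8, size=1440)                      # mean heart rate per minute
vpb = rng.poisson(np.clip((hr - 60) / 10, 0.1, None))  # VPB count per minute
hr_level = np.digitize(hr, np.quantile(hr, [0.25, 0.5, 0.75]))
print(plugin_mi(np.minimum(vpb, 5), hr_level))         # cap counts to limit sparsity
```

Computing the same quantity at several time lags between the two series would give a simple picture of temporal structure, in the spirit of the analysis described above.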

