mutual entropy
Recently Published Documents


TOTAL DOCUMENTS

60
(FIVE YEARS 1)

H-INDEX

8
(FIVE YEARS 0)

2020 ◽  
Vol 59 (3) ◽  
pp. 1223-1231
Author(s):  
Yanling Li ◽  
Yunliang Wen ◽  
Hexin Lai ◽  
Qiqi Zhao

Entropy ◽  
2020 ◽  
Vol 22 (3) ◽  
pp. 298
Author(s):  
Noboru Watanabe

It has been shown that joint probability distributions of quantum systems generally do not exist, and the key to resolving this concern is the compound state invented by Ohya. The Ohya compound state, constructed from the Schatten decomposition (i.e., one-dimensional orthogonal projections) of the input state, expresses the correlation between the states of the input and output systems. In 1983, Ohya formulated the quantum mutual entropy by applying this compound state. Since this mutual entropy satisfies the fundamental inequality, one may say that it represents the amount of information correctly transmitted from the input system through the channel to the output system, and it may play an important role in discussing the efficiency of information transfer in quantum systems. Since the Ohya compound state is a separable state, it is important to look more carefully into entangled compound states. This paper is intended as an investigation of the construction of the entangled compound state, and the hybrid entangled compound state is introduced. The purpose of this paper is to assess the validity of the compound states constructing the quantum mutual entropy type complexity. It seems reasonable to suppose that the quantum mutual entropy type complexity defined by means of the entangled compound state is not useful for discussing the efficiency of information transmission from the initial system to the final system.
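The quantity at stake in this abstract can be illustrated numerically. The sketch below is not Ohya's compound-state construction itself; as a stand-in it computes the standard quantum mutual information I(A:B) = S(A) + S(B) - S(AB) for a maximally entangled two-qubit state, using NumPy.

```python
import numpy as np

def von_neumann_entropy(rho):
    """Von Neumann entropy S(rho) = -Tr(rho log rho), in bits."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop numerical zeros
    return float(-np.sum(evals * np.log2(evals)))

def partial_trace(rho, keep):
    """Partial trace of a two-qubit density matrix; keep subsystem 0 or 1."""
    r = rho.reshape(2, 2, 2, 2)           # indices (a, b, a', b')
    if keep == 0:
        return np.trace(r, axis1=1, axis2=3)   # trace out B -> rho_A
    return np.trace(r, axis1=0, axis2=2)       # trace out A -> rho_B

# Maximally entangled Bell state |Phi+> = (|00> + |11>)/sqrt(2)
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_ab = np.outer(psi, psi.conj())

rho_a = partial_trace(rho_ab, keep=0)
rho_b = partial_trace(rho_ab, keep=1)

# Quantum mutual information I(A:B) = S(A) + S(B) - S(AB)
mi = (von_neumann_entropy(rho_a) + von_neumann_entropy(rho_b)
      - von_neumann_entropy(rho_ab))
print(mi)   # 2.0 bits for a maximally entangled pair of qubits
```

For this pure entangled state each marginal is maximally mixed (1 bit each) while the joint state has zero entropy, so the mutual information reaches its maximum of 2 bits.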


2019 ◽  
Vol 16 (5) ◽  
pp. 366-373
Author(s):  
Xiong Li ◽  
Hui Yang ◽  
Kaifu Wen ◽  
Xiaoming Zhong ◽  
Xuewen Xia ◽  
...  

Background: Epistasis makes complex diseases difficult to understand, especially when heterogeneity also exists. Heterogeneity of complex diseases blurs the distribution of the case population. However, traditional methods proposed to detect epistasis often ignore heterogeneity, resulting in low power for association studies. Methods: In this study, we first use rank information in the Classification Decision Tree and Mutual Entropy method (CTME) to construct two different evaluation scores, i.e., multiple objectives. In addition, we improve the calculation of the joint entropy between SNPs and the disease label, which raises the efficiency of CTME. Then, the ant colony algorithm is applied to search the two-locus epistatic combination space. To handle potential heterogeneity, all candidate two-locus SNP pairs are merged to recognize multiple distinct epistatic combinations. Finally, all these solutions are tested by the χ2 test. Results and Conclusion: Experiments show that our method, CTME, improves the power of association studies. More importantly, CTME also detects multiple epistatic SNPs contributing to heterogeneity. The experimental results show that CTME has advantages in both power and efficiency.
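As a rough illustration of the mutual-entropy scoring idea underlying such methods (this is not the CTME algorithm itself; the toy genotypes below are hypothetical), a plug-in estimate of the mutual information between a two-locus genotype combination and a case/control label can be computed as follows:

```python
from collections import Counter
import math

def mutual_info(xs, ys):
    """Plug-in estimate of I(X;Y) in bits from paired samples."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    return sum(c / n * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# Toy data: minor-allele counts (0/1/2) at two SNPs and a case/control label
snp1  = [0, 1, 2, 2, 0, 1, 2, 0]
snp2  = [0, 0, 1, 1, 2, 2, 1, 0]
label = [0, 0, 1, 1, 0, 1, 1, 0]

# Score the two-locus combination jointly against the disease label;
# a joint score can reveal epistatic pairs that single-locus tests miss
pair = list(zip(snp1, snp2))
print(mutual_info(pair, label))
```

The joint two-locus score is what makes epistasis visible: two SNPs can each carry little marginal information about the label while their combination carries a lot.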


2019 ◽  
Vol 26 (02) ◽  
pp. 1950009 ◽  
Author(s):  
Kyouhei Ohmura ◽  
Noboru Watanabe

The classical dynamical mutual entropy measures the average information content transmitted through a channel. Classical Markovian sources are important in communication theory since they constitute reasonable models for languages. In this paper, we define the quantum dynamical mutual entropy through quantum Markov chains and calculate it for several simple models.
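The classical side of this picture can be made concrete. The sketch below (an editor's illustration, not the paper's quantum construction) computes the entropy rate of a stationary classical Markov chain, i.e., the average per-step information content of a Markovian source:

```python
import numpy as np

def entropy_rate(P):
    """Entropy rate (bits per step) of a stationary Markov chain
    with row-stochastic transition matrix P."""
    # Stationary distribution: left eigenvector of P for eigenvalue 1
    evals, evecs = np.linalg.eig(P.T)
    pi = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
    pi = pi / pi.sum()
    # h = -sum_i pi_i sum_j P_ij log2 P_ij, skipping zero-probability moves
    mask = P > 0
    logP = np.zeros_like(P)
    logP[mask] = np.log2(P[mask])
    return float(-np.sum(pi[:, None] * P * logP))

# Two-state source: stay with probability 0.9, switch with probability 0.1
P = np.array([[0.9, 0.1],
              [0.1, 0.9]])
print(entropy_rate(P))   # binary entropy H(0.1), about 0.469 bits per step
```

A strongly persistent chain like this one is predictable, so its entropy rate sits well below the 1 bit per step of a fair-coin source.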


Agronomy ◽  
2019 ◽  
Vol 9 (5) ◽  
pp. 234 ◽  
Author(s):  
Hsieh Fushing ◽  
Olivia Lee ◽  
Constantin Heitkamp ◽  
Hildegarde Heymann ◽  
Susan E. Ebeler ◽  
...  

This study explores the relationships between chemical and sensory characteristics of wines in connection with their regions of production. The objective is to identify whether such characteristics are significant enough to serve as signatures of a terroir for wines, thereby supporting the concept of regionality. We argue that the relationships between characteristics and regions of production for the set of wines under study are rendered complicated by possible non-linear relationships between the characteristics themselves. Consequently, we propose a new approach for analyzing the wine data that relies on these relationships instead of trying to circumvent them. This new approach follows two steps: we first cluster the measurements for each characteristic (chemical or sensory) independently. We then assign the distance between two features to be the mutual entropy of the clustering results they generate. The set of characteristics is then clustered using this distance measure. The result of this clustering is a set of sub-groups of characteristics, such that two characteristics in the same group carry similar, i.e., synergetic, information with respect to the wines under study. Those wines are then analyzed separately on the different sub-groups of features. We have used this method to analyze the similarities and differences between Malbec wines from Argentina and California, as well as the similarities and differences between sub-regions of those two main wine-producing regions. We report the detection of groups of features that characterize the origins of the different wines included in the study. We note stronger evidence of regionality for Argentinian Malbec wines than for Californian ones, at least for the sub-regions of production included in this study.
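The two-step procedure described above can be sketched as follows. This is a hedged toy illustration, not the authors' pipeline: the feature names are hypothetical, and quantile binning stands in for the per-feature clustering step.

```python
import math
from collections import Counter
import numpy as np

def bin_feature(x, k=3):
    """Stand-in for per-feature clustering: quantile-bin values into k groups."""
    edges = np.quantile(x, np.linspace(0, 1, k + 1)[1:-1])
    return np.digitize(x, edges)

def mutual_entropy(a, b):
    """Mutual information (bits) between two label sequences."""
    n = len(a)
    pab, pa, pb = Counter(zip(a, b)), Counter(a), Counter(b)
    return sum(c / n * math.log2((c / n) / ((pa[x] / n) * (pb[y] / n)))
               for (x, y), c in pab.items())

# Hypothetical wine features: one correlated pair, one unrelated feature
rng = np.random.default_rng(0)
alcohol   = rng.normal(13.5, 0.5, 200)                    # chemical
sweetness = alcohol * 0.3 + rng.normal(0, 0.1, 200)       # sensory, correlated
acidity   = rng.normal(3.4, 0.2, 200)                     # chemical, unrelated

labels = {name: bin_feature(v) for name, v in
          [("alcohol", alcohol), ("sweetness", sweetness), ("acidity", acidity)]}

# Step 2: features whose clusterings share information get grouped together
print(mutual_entropy(labels["alcohol"], labels["sweetness"]))  # high
print(mutual_entropy(labels["alcohol"], labels["acidity"]))    # near zero
```

A clustering over these pairwise scores would then put alcohol and sweetness in one synergetic sub-group and leave acidity apart, mirroring the grouping step in the abstract.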


Entropy ◽  
2018 ◽  
Vol 20 (9) ◽  
pp. 679 ◽  
Author(s):  
David Ellerman

Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences and distinguishability and is formalized using the distinctions (“dits”) of a partition (a pair of points distinguished by the partition). All the definitions of simple, joint, conditional and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose of this paper is to give the direct generalization to quantum logical information theory that similarly focuses on the pairs of eigenstates distinguished by an observable, i.e., qudits of an observable. The fundamental theorem for quantum logical entropy and measurement establishes a direct quantitative connection between the increase in quantum logical entropy due to a projective measurement and the eigenstates (cohered together in the pure superposition state being measured) that are distinguished by the measurement (decohered in the post-measurement mixed state). Both the classical and quantum versions of logical entropy have simple interpretations as “two-draw” probabilities for distinctions. The conclusion is that quantum logical entropy is the simple and natural notion of information for quantum information theory focusing on the distinguishing of quantum states.
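The "two-draw" interpretation has a simple classical form: the logical entropy of a distribution p is h(p) = 1 - Σ p_i², the probability that two independent draws from p fall in different blocks, i.e., yield a distinction. A minimal sketch:

```python
from fractions import Fraction

def logical_entropy(p):
    """Logical entropy h(p) = 1 - sum(p_i^2): the probability that two
    independent draws from p land in different blocks (a 'distinction')."""
    return 1 - sum(pi * pi for pi in p)

# Uniform distribution on 4 outcomes: two draws differ with probability 3/4
p = [Fraction(1, 4)] * 4
print(logical_entropy(p))   # 3/4

# A single-block partition makes no distinctions at all
print(logical_entropy([Fraction(1)]))   # 0
```

Unlike Shannon entropy, this is a plain probability (always between 0 and 1), which is what gives both the classical and quantum versions their direct "two-draw" reading.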


2018 ◽  
Vol 182 ◽  
pp. 02039
Author(s):  
David Ellerman

Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences, and distinguishability, and is formalized using the distinctions (“dits”) of a partition (a pair of points distinguished by the partition). All the definitions of simple, joint, conditional, and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose of this talk is to outline the direct generalization to quantum logical information theory that similarly focuses on the pairs of eigenstates distinguished by an observable, i.e., “qudits” of an observable. The fundamental theorem for quantum logical entropy and measurement establishes a direct quantitative connection between the increase in quantum logical entropy due to a projective measurement and the eigenstates (cohered together in the pure superposition state being measured) that are distinguished by the measurement (decohered in the postmeasurement mixed state). Both the classical and quantum versions of logical entropy have simple interpretations as “two-draw” probabilities for distinctions. The conclusion is that quantum logical entropy is the simple and natural notion of information for quantum information theory focusing on the distinguishing of quantum states.

