Mutual information, relative entropy and estimation error in semi-martingale channels

Author(s):  
Jiantao Jiao ◽  
Kartik Venkat ◽  
Tsachy Weissman
2018 ◽  
Vol 64 (10) ◽  
pp. 6662-6671 ◽  

2021 ◽  
Vol 2021 (3) ◽  
Author(s):  
Lucas Daguerre ◽  
Raimel Medina ◽  
Mario Solís ◽  
Gonzalo Torroba

Abstract We study different aspects of quantum field theory at finite density using methods from quantum information theory. For simplicity we focus on massive Dirac fermions with nonzero chemical potential, and work in 1 + 1 space-time dimensions. Using the entanglement entropy on an interval, we construct an entropic c-function that is finite. Unlike what happens in Lorentz-invariant theories, this c-function exhibits a strong violation of monotonicity; it also encodes the creation of long-range entanglement from the Fermi surface. Motivated by previous works on lattice models, we next calculate numerically the Rényi entropies and find Friedel-type oscillations; these are understood in terms of a defect operator product expansion. Furthermore, we consider the mutual information as a measure of correlation functions between different regions. Using a long-distance expansion previously developed by Cardy, we argue that the mutual information detects Fermi surface correlations already at leading order in the expansion. We also analyze the relative entropy and its Rényi generalizations in order to distinguish states with different charge and/or mass. In particular, we show that states in different superselection sectors give rise to a super-extensive behavior in the relative entropy. Finally, we discuss possible extensions to interacting theories, and argue for the relevance of some of these measures for probing non-Fermi liquids.
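For reference, the entanglement-based quantities named in this abstract have standard definitions; the display below is a generic reminder of those definitions, not the paper's specific constructions:

```latex
% Entanglement entropy of a region A with reduced density matrix \rho_A,
% and the Renyi entropies S_n that reduce to S(A) as n \to 1:
S(A) = -\operatorname{Tr}\,\rho_A \log \rho_A ,
\qquad
S_n(A) = \frac{1}{1-n}\,\log \operatorname{Tr}\rho_A^{\,n} .

% Mutual information between disjoint regions A and B:
I(A,B) = S(A) + S(B) - S(A \cup B) .

% Relative entropy between states \rho and \sigma:
S(\rho \,\|\, \sigma) = \operatorname{Tr}\,\rho\,(\log\rho - \log\sigma) .
```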


2014 ◽  
Vol 2014 ◽  
pp. 1-6
Author(s):  
Hua Li ◽  
Jie Zhou

This paper considers the robust estimation fusion problem for distributed multisensor systems with uncertain correlations of local estimation errors. For an uncertainty class characterized by the Kullback-Leibler (KL) divergence from the actual model to the nominal model of the local estimation error covariance, the robust estimation fusion problem is formulated as finding a linear minimum-variance unbiased estimator for the least favorable model. It is proved that the optimal fuser under the nominal correlation model remains robust when the estimation error has relative entropy uncertainty.
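For zero-mean Gaussian error models, the KL divergence that defines the uncertainty class above has a closed form in the covariance matrices. The sketch below is an illustration of that standard formula, assuming Gaussian errors; the function name and NumPy implementation are assumptions, not the authors' code.

```python
import numpy as np

def gaussian_kl(cov_actual, cov_nominal):
    """D( N(0, cov_actual) || N(0, cov_nominal) ) for zero-mean Gaussians:
    0.5 * ( tr(S1^{-1} S0) - d + ln det S1 - ln det S0 ), nats."""
    d = cov_actual.shape[0]
    trace_term = np.trace(np.linalg.inv(cov_nominal) @ cov_actual)
    # slogdet avoids overflow/underflow in the determinants
    _, logdet_nominal = np.linalg.slogdet(cov_nominal)
    _, logdet_actual = np.linalg.slogdet(cov_actual)
    return 0.5 * (trace_term - d + logdet_nominal - logdet_actual)
```

An uncertainty class of the kind described in the abstract would then collect all actual covariances within a fixed KL radius of the nominal one.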


Author(s):  
QINGHUA HU ◽  
DAREN YU

Yager's entropy was proposed to compute the information of a fuzzy indiscernibility relation. In this paper we present a novel interpretation of Yager's entropy from the point of view of the discernibility power of a relation. Some basic definitions in Shannon's information theory are then generalized based on Yager's entropy. We introduce joint entropy, conditional entropy, mutual information, and relative entropy to compute the information changes under fuzzy indiscernibility relation operations. Conditional entropy and relative conditional entropy are proposed to measure the information increment, which is interpreted as the significance of an attribute in the fuzzy rough set model. As an application, we redefine the independence of an attribute set, reduct, and relative reduct in the fuzzy rough set model based on Yager's entropy. Experimental results show that the proposed approach is suitable for fuzzy and numeric data reduction.
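The classical Shannon quantities that this paper generalizes can be sketched for ordinary discrete distributions as follows. This is a plain-probability illustration of the base definitions, not the fuzzy (Yager) generalization the paper develops; the function names are assumptions.

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) in bits of a discrete pmf."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # 0 * log 0 is taken as 0
    return -np.sum(p * np.log2(p))

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint pmf matrix."""
    px = joint.sum(axis=1)
    py = joint.sum(axis=0)
    return entropy(px) + entropy(py) - entropy(joint.ravel())

def relative_entropy(p, q):
    """D(p || q) in bits, assuming q > 0 wherever p > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))
```

Conditional entropy then follows as H(Y|X) = H(X,Y) - H(X), which is the chain-rule identity the fuzzy versions in the paper mirror.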

