Nearest neighbor decoding for additive non-Gaussian noise channels

1996 ◽  
Vol 42 (5) ◽  
pp. 1520-1529 ◽  
Author(s):  
A. Lapidoth

IEEE Access ◽  
2018 ◽  
Vol 6 ◽  
pp. 52607-52615 ◽  
Author(s):  
Qiwei Zheng ◽  
Fanggang Wang ◽  
Bo Ai ◽  
Zhangdui Zhong

2016 ◽  
Vol 53 (2) ◽  
pp. 360-368 ◽  
Author(s):  
Nayereh Bagheri Khoolenjani ◽  
Mohammad Hossein Alamatsaz

Abstract De Bruijn's identity relates two important concepts in information theory: Fisher information and differential entropy. Unlike the common practice in the literature, in this paper we consider general additive non-Gaussian noise channels in which, more realistically, the input signal and the additive noise are not independently distributed. It is shown that, for generally dependent signal and noise, the first derivative of the differential entropy is directly related to the conditional mean estimate of the input. Then, using Gaussian and Farlie–Gumbel–Morgenstern copulas, special versions of the result are derived for the case of additive normally distributed noise. The previous result for independent Gaussian noise channels is recovered as a special case. Illustrative examples are also provided.
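The classical independent-noise special case mentioned at the end of the abstract can be checked numerically. The sketch below (an illustration, not the paper's method) verifies the standard de Bruijn identity d/dt h(X + √t Z) = ½ J(X + √t Z) for a Gaussian input, where both sides have closed forms; the variable names and parameter values are illustrative assumptions.

```python
import math

# Classical de Bruijn identity (independent Gaussian noise case):
# for Y = X + sqrt(t) * Z, with Z ~ N(0, 1) independent of X,
#     d/dt h(Y) = (1/2) * J(Y),
# where h is differential entropy and J is Fisher information.
# Sanity check with a Gaussian input X ~ N(0, sigma2), so that
# Y ~ N(0, sigma2 + t) and both sides are available in closed form.

def h_gaussian(var):
    """Differential entropy of N(0, var) in nats."""
    return 0.5 * math.log(2 * math.pi * math.e * var)

def fisher_gaussian(var):
    """Fisher information (location parameter) of N(0, var)."""
    return 1.0 / var

sigma2, t, dt = 2.0, 0.5, 1e-6  # illustrative values

# Central-difference derivative of h(X + sqrt(t) Z) with respect to t.
lhs = (h_gaussian(sigma2 + t + dt) - h_gaussian(sigma2 + t - dt)) / (2 * dt)
rhs = 0.5 * fisher_gaussian(sigma2 + t)

print(abs(lhs - rhs) < 1e-9)  # both sides equal 1 / (2 * (sigma2 + t))
```

For a Gaussian input both sides reduce to 1/(2(σ² + t)), which is why the agreement is exact up to finite-difference error; the paper's contribution is extending this kind of relation to dependent signal and noise via copulas.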

