Extension of de Bruijn's identity to dependent non-Gaussian noise channels

2016 ◽ Vol 53 (2) ◽ pp. 360-368
Author(s): Nayereh Bagheri Khoolenjani, Mohammad Hossein Alamatsaz

Abstract: De Bruijn's identity relates two important concepts in information theory: Fisher information and differential entropy. Unlike the common practice in the literature, in this paper we consider general additive non-Gaussian noise channels in which, more realistically, the input signal and the additive noise are not independently distributed. It is shown that, for generally dependent signal and noise, the first derivative of the differential entropy is directly related to the conditional mean estimate of the input. Then, using Gaussian and Farlie–Gumbel–Morgenstern copulas, specialized versions of the result are derived for the case of additive normally distributed noise. The previous result for independent Gaussian noise channels is recovered as a special case. Illustrative examples are also provided.
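For reference, the classical identity that this paper generalizes can be stated in the independent Gaussian case (standard background, not quoted from the paper): with Y_t = X + \sqrt{t}\,Z, where Z \sim \mathcal{N}(0,1) is independent of X,

\[ \frac{d}{dt}\, h(Y_t) = \frac{1}{2}\, J(Y_t), \]

where h(\cdot) denotes the differential entropy and J(\cdot) the Fisher information. The Farlie–Gumbel–Morgenstern copula mentioned above has the standard one-parameter form \( C_\theta(u,v) = uv\,[1 + \theta(1-u)(1-v)] \), \( \theta \in [-1,1] \), a common model of weak dependence between signal and noise.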

IEEE Access ◽ 2018 ◽ Vol 6 ◽ pp. 52607-52615
Author(s): Qiwei Zheng, Fanggang Wang, Bo Ai, Zhangdui Zhong

2005 ◽ Vol 15 (09) ◽ pp. 2985-2994
Author(s): François Chapeau-Blondeau, David Rousseau

Abstract: The optimal detection of a signal of known form hidden in additive white noise is examined in the framework of stochastic resonance and noise-aided information processing. Conditions are exhibited under which the performance of the optimal detector improves as the level of the additive (non-Gaussian, bimodal) noise is raised. When a threshold quantization is applied to the additive signal–noise mixture prior to optimal detection, another form of noise-induced improvement can be obtained, with subthreshold signals and Gaussian noise. Optimization of the quantization threshold shows that, even in symmetric detection settings, the optimal threshold can lie away from the center of symmetry, in a subthreshold configuration of the signals. These properties of non-Gaussian noise and nonlinear preprocessing in optimal detection are relevant to the ongoing exploration of the various modalities and potentialities of stochastic resonance.
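As an illustration of the quantizer effect described above, here is a minimal numerical sketch. It is not the paper's exact setup: it assumes a constant subthreshold signal of amplitude A in Gaussian noise and a hard threshold tau applied before a minimum-error test, and all names (A, tau, error_prob) are hypothetical. The exactly computed error probability attains its minimum at a strictly positive, interior noise level, which is the stochastic-resonance signature.

# Sketch: stochastic resonance in 1-bit quantized detection.
# Hypothetical setup (not taken from the paper): constant subthreshold
# signal of amplitude A < tau in additive Gaussian noise of std sigma,
# hard threshold tau applied to each sample before detection.
import numpy as np
from scipy.stats import norm, binom

A, tau, N = 0.5, 1.0, 31              # subthreshold signal: A < tau
sigmas = np.linspace(0.05, 3.0, 60)   # grid of noise levels

def error_prob(sigma):
    p0 = norm.sf(tau / sigma)         # P(quantizer fires | noise only)
    p1 = norm.sf((tau - A) / sigma)   # P(quantizer fires | signal + noise)
    k = np.arange(N + 1)
    # Bayes-optimal test (equal priors) picks the hypothesis with the
    # larger likelihood at the observed count k of fired samples; its
    # error probability is (1/2) * sum_k min(P0(k), P1(k)).
    return 0.5 * np.sum(np.minimum(binom.pmf(k, N, p0), binom.pmf(k, N, p1)))

pes = [error_prob(s) for s in sigmas]
best = sigmas[int(np.argmin(pes))]
# Error approaches 0.5 as sigma -> 0 (nothing crosses tau) and as
# sigma -> inf (hypotheses indistinguishable); the minimum is interior.
print(f"minimum error {min(pes):.4f} at sigma = {best:.2f}")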


2019 ◽ Vol 33 (4) ◽ pp. 618-657
Author(s): Fatemeh Asgari, Mohammad Hossein Alamatsaz, Nayereh Bagheri Khoolenjani

Abstract: Considering the Gaussian noise channel, Costa [4] investigated the concavity of the entropy power when the input signal and noise components are independent; his argument was connected to the first-order derivative of the Fisher information. In real situations, however, the noise can be highly dependent on the main signal. In this paper, we suppose that the input signal and noise variables are dependent, and some well-known copula functions are used to define their dependence structure. The first- and second-order derivatives of the Fisher information of the model are obtained. Using these derivatives, we generalize two inequalities based on the Fisher information and a functional closely related to the Fisher information to the case where the input signal and noise variables are dependent. We also show that the previous results for the independent case are recovered as special cases of our result. Several applications are provided to illustrate the usefulness of our results. Finally, the channel capacity of the Gaussian noise channel model with dependent signal and noise is studied.
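For context, the independent-case objects behind Costa's result are standard (background, not quoted from the abstract): the entropy power of Y_t = X + \sqrt{t}\,Z is

\[ N(Y_t) = \frac{1}{2\pi e}\, e^{2 h(Y_t)}, \]

and Costa proved that t \mapsto N(Y_t) is concave, \( \frac{d^2}{dt^2} N(Y_t) \le 0 \), when Z is Gaussian and independent of X. By de Bruijn's identity, \( \frac{d}{dt} N(Y_t) = N(Y_t)\, J(Y_t) \), which is why the analysis runs through derivatives of the Fisher information.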

