Mutual information and conditional mean estimation in Poisson channels


Abstract

Following the discovery of a fundamental connection between information measures and estimation measures in Gaussian channels, this paper explores the counterpart of those results in Poisson channels. In the continuous-time setting, the received signal is a doubly stochastic Poisson point process whose rate is equal to the input signal plus a dark current. It is found that, regardless of the statistics of the input, the derivative of the input–output mutual information with respect to the intensity of the additive dark current can be expressed as the expected difference between the logarithm of the input and the logarithm of its noncausal conditional mean estimate. The same holds for the derivative with respect to input scaling, but with the logarithmic function replaced by x log x. Similar relationships hold for discrete-time versions of the channel where the outputs are Poisson random variables conditioned on the input symbols. © 2008 IEEE.
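The discrete-time relationship described above can be checked numerically. The sketch below assumes a channel Y ~ Poisson(X + λ) with a hypothetical two-point input distribution (X ∈ {1, 3}, equiprobable) and the sign convention dI/dλ = E[log(X + λ)] − E[log E[X + λ | Y]]; the distribution, truncation point YMAX, and sign convention are illustrative assumptions, not taken from the paper itself.

```python
import math

def poisson_pmf(y, mu):
    # Poisson pmf via lgamma for numerical stability.
    return math.exp(-mu + y * math.log(mu) - math.lgamma(y + 1))

# Hypothetical two-point input distribution (assumption, not from the paper).
xs = [1.0, 3.0]
px = [0.5, 0.5]
YMAX = 200  # truncation of the output alphabet; tail mass is negligible here

def mutual_info(lam):
    # I(X; Y) in nats for Y ~ Poisson(X + lam), computed by direct summation.
    I = 0.0
    for y in range(YMAX):
        py = sum(p * poisson_pmf(y, x + lam) for p, x in zip(px, xs))
        if py <= 0.0:
            continue
        for p, x in zip(px, xs):
            pyx = poisson_pmf(y, x + lam)
            if pyx > 0.0:
                I += p * pyx * math.log(pyx / py)
    return I

def estimation_side(lam):
    # E[log(X + lam)] - E[log E[X + lam | Y]] (assumed sign convention).
    e_log_input = sum(p * math.log(x + lam) for p, x in zip(px, xs))
    e_log_cme = 0.0
    for y in range(YMAX):
        py = sum(p * poisson_pmf(y, x + lam) for p, x in zip(px, xs))
        if py <= 0.0:
            continue
        cond_mean = sum(p * poisson_pmf(y, x + lam) * (x + lam)
                        for p, x in zip(px, xs)) / py
        e_log_cme += py * math.log(cond_mean)
    return e_log_input - e_log_cme

lam, h = 0.5, 1e-4
numeric = (mutual_info(lam + h) - mutual_info(lam - h)) / (2 * h)
print(numeric, estimation_side(lam))
```

For this example the central-difference estimate of dI/dλ agrees with the estimation-theoretic expression to within the finite-difference error; the derivative is negative, consistent with dark current degrading the mutual information.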

Citation (APA)

Guo, D., Shamai, S., & Verdú, S. (2008). Mutual information and conditional mean estimation in Poisson channels. IEEE Transactions on Information Theory, 54(5), 1837–1849. https://doi.org/10.1109/TIT.2008.920206
