Mutual information and conditional mean estimation in Poisson channels

  • Dongning Guo
  • Shlomo Shamai
  • Sergio Verdú

Following the discovery of a fundamental connection between information measures and estimation measures in Gaussian channels, this paper explores the counterpart of those results in Poisson channels. In the continuous-time setting, the received signal is a doubly stochastic Poisson point process whose rate is equal to the input signal plus a dark current. It is found that, regardless of the statistics of the input, the derivative of the input-output mutual information with respect to the intensity of the additive dark current can be expressed as the expected difference between the logarithm of the input and the logarithm of its noncausal conditional mean estimate. The same holds for the derivative with respect to input scaling, but with the logarithmic function replaced by x log x. Similar relationships hold for discrete-time versions of the channel where the outputs are Poisson random variables conditioned on the input symbols.
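The discrete-time identity described above can be checked numerically. The sketch below is an illustration, not code from the paper: it assumes a hypothetical two-point input (X ∈ {1, 3} with equal probability) and a dark current λ = 0.5, works in nats, and computes the mutual information of a scalar channel Y ~ Poisson(X + λ) by direct truncated summation. A central finite difference of I in λ is then compared with the expected log-difference E[log(X + λ)] − E[log(E[X|Y] + λ)].

```python
import math

def poisson_pmf(mu, y):
    # P(Y = y) for Y ~ Poisson(mu), computed in log space for stability
    return math.exp(-mu + y * math.log(mu) - math.lgamma(y + 1))

# Hypothetical two-point input distribution (illustrative, not from the paper)
xs, px = [1.0, 3.0], [0.5, 0.5]
Y_MAX = 80  # truncation point; Poisson tails beyond this are negligible here

def mutual_information(lam):
    """I(X; Y) in nats for Y ~ Poisson(X + lam), by direct summation."""
    total = 0.0
    for y in range(Y_MAX):
        py = sum(p * poisson_pmf(x + lam, y) for x, p in zip(xs, px))
        for x, p in zip(xs, px):
            pyx = poisson_pmf(x + lam, y)
            total += p * pyx * math.log(pyx / py)
    return total

def expected_log_difference(lam):
    """E[log(X + lam)] - E[log(E[X|Y] + lam)]."""
    t1 = sum(p * math.log(x + lam) for x, p in zip(xs, px))
    t2 = 0.0
    for y in range(Y_MAX):
        py = sum(p * poisson_pmf(x + lam, y) for x, p in zip(xs, px))
        # conditional mean estimate of X given Y = y
        xhat = sum(p * poisson_pmf(x + lam, y) * x for x, p in zip(xs, px)) / py
        t2 += py * math.log(xhat + lam)
    return t1 - t2

lam, h = 0.5, 1e-4
fd = (mutual_information(lam + h) - mutual_information(lam - h)) / (2 * h)
print(fd, expected_log_difference(lam))  # the two values agree closely
```

Both quantities come out negative, consistent with the intuition that increasing the dark current degrades the channel; by conditional Jensen's inequality, E[log(X + λ)] ≤ E[log(E[X|Y] + λ)].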

Author-supplied keywords

  • Mutual information
  • Nonlinear filtering
  • Optimal estimation
  • Point process
  • Poisson process
  • Smoothing

