Mutual information functions versus correlation functions


Abstract

This paper studies one application of mutual information to symbolic sequences: the mutual information function M(d). This function is compared with the more frequently used correlation function Γ(d). An exact relation between M(d) and Γ(d) is derived for binary sequences. For sequences with more than two symbols, no such general relation exists; in particular, Γ(d)=0 may or may not imply M(d)=0. This situation, in which symbols separated by a distance d are linearly uncorrelated but not necessarily statistically independent, is studied for ternary sequences. The finite-size effect on the estimation of mutual information is also examined. Finally, the concept of "symbolic noise" is discussed. © 1990 Plenum Publishing Corporation.
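To make the comparison in the abstract concrete, the two functions can be sketched with naive plug-in (empirical-frequency) estimates. The definitions below are the standard ones and are an assumption on my part, not the paper's exact estimators; in particular, the paper analyzes finite-size corrections that this naive estimator ignores, and the numeric encoding of symbols used for Γ(d) is a choice, not part of the definition of M(d).

```python
import math
from collections import Counter

def mutual_information(seq, d):
    """Plug-in estimate of M(d): mutual information (in bits) between
    the symbol at position i and the symbol at position i + d."""
    pairs = [(seq[i], seq[i + d]) for i in range(len(seq) - d)]
    n = len(pairs)
    joint = Counter(pairs)                 # empirical joint counts
    left = Counter(a for a, _ in pairs)    # marginal counts of first symbol
    right = Counter(b for _, b in pairs)   # marginal counts of second symbol
    m = 0.0
    for (a, b), c in joint.items():
        # p_ab * log2( p_ab / (p_a * p_b) ), with counts converted to frequencies
        m += (c / n) * math.log2(c * n / (left[a] * right[b]))
    return m

def correlation(seq, d, values=None):
    """Plug-in estimate of Γ(d): covariance of numeric symbol values at
    distance d. Symbols are mapped to 0, 1, 2, ... unless `values` is given."""
    if values is None:
        values = {s: v for v, s in enumerate(sorted(set(seq)))}
    x = [values[s] for s in seq[:len(seq) - d]]
    y = [values[s] for s in seq[d:]]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / n

# Example: in the periodic binary sequence 0101..., symbols one step apart
# are perfectly anti-correlated, so both functions are far from zero:
seq = "01" * 1000
print(mutual_information(seq, 1))  # ~1 bit
print(correlation(seq, 1))         # ~ -0.25
```

For binary sequences these two quantities move together, consistent with the exact relation the paper derives; for three or more symbols one can construct sequences with Γ(d)=0 but M(d)>0, which is the distinction the abstract highlights.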

Citation (APA)

Li, W. (1990). Mutual information functions versus correlation functions. Journal of Statistical Physics, 60(5–6), 823–837. https://doi.org/10.1007/BF01025996
