Appendix: On common information and related characteristics of correlated information sources

Abstract

This is a literal copy of a manuscript from 1974. References have been updated. It contains a critical discussion of then-recent concepts of "common information" and also suggests alternative definitions. (Compare pages 402–405 in the book by I. Csiszár and J. Körner, "Information Theory: Coding Theorems for Discrete Memoryless Systems", Akadémiai Kiadó, Budapest 1981.) One of our definitions gave rise to the now well-known source coding problem for two helpers (formulated in 2.) on page 7). More importantly, an extension of one concept to "common information with list knowledge" has recently turned out to play a key role in analyzing the contribution of a correlated source to the identification capacity of a channel (R. Ahlswede and V. Balakirsky, "Identification under Random Processes", invited paper in honor of Mark Pinsker, Sept. 1995). Thus the old ideas have now led to concepts of operational significance and are therefore made accessible here. © Springer-Verlag Berlin Heidelberg 2006.
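For background only (this sketch is not taken from the manuscript, and the manuscript's alternative definitions may differ from it): the common-information quantity usually attributed to Gács and Körner, discussed on the Csiszár–Körner book pages cited above, can be written in LaTeX as

% Background sketch, not the paper's own definition.
% Maximize over deterministic functions f, g that agree almost surely.
\[
  C_{\mathrm{GK}}(X;Y) \;=\; \max_{f,\,g:\; f(X)=g(Y)\ \text{a.s.}} H\bigl(f(X)\bigr),
\]

that is, the largest entropy of a random variable that both terminals can compute without error from their own observations. It never exceeds the mutual information I(X;Y) and is typically strictly smaller.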

Citation (APA)

Ahlswede, R., & Körner, J. (2006). Appendix: On common information and related characteristics of correlated information sources. In Lecture Notes in Computer Science (Vol. 4123, pp. 664–677). Springer. https://doi.org/10.1007/11889342_41
