This is a literal copy of a manuscript from 1974; only the references have been updated. It contains a critical discussion of concepts of "common information" that were recent at the time, and also suggests alternative definitions. (Compare pages 402–405 in the book by I. Csiszár and J. Körner, "Information Theory: Coding Theorems for Discrete Memoryless Systems", Akadémiai Kiadó, Budapest 1981.) One of our definitions gave rise to the now well-known source coding problem for two helpers (formulated in 2.) on page 7). More importantly, an extension of one concept to "common information with list knowledge" has recently (R. Ahlswede and V. Balakirsky, "Identification under Random Processes", invited paper in honor of Mark Pinsker, Sept. 1995) turned out to play a key role in analyzing the contribution of a correlated source to the identification capacity of a channel. Thus the old ideas have now led to concepts of operational significance, and they are therefore made accessible here. © Springer-Verlag Berlin Heidelberg 2006.
CITATION STYLE
Ahlswede, R., & Körner, J. (2006). Appendix: On common information and related characteristics of correlated information sources. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4123 LNCS, pp. 664–677). https://doi.org/10.1007/11889342_41