Identification entropy

Abstract

Shannon (1948) has shown that a source $(\mathcal{U}, P, U)$ with output $U$ satisfying $\operatorname{Prob}(U = u) = p_u$ can be encoded in a prefix code $\mathcal{C} = \{c_u : u \in \mathcal{U}\} \subset \{0, 1\}^*$ such that for the entropy $H(P) = \sum_{u \in \mathcal{U}} -p_u \log p_u$ we have $H(P) \le \sum_{u \in \mathcal{U}} p_u \lVert c_u \rVert \le H(P) + 1$, where $\lVert c_u \rVert$ is the length of $c_u$.

We use a prefix code $\mathcal{C}$ for another purpose, namely noiseless identification: every user who wants to know whether a $u \in \mathcal{U}$ of his interest is the actual source output or not checks whether the encoded output coincides with $c_u$ in the first, second, etc. letter and stops when the first different letter occurs or when the output equals $c_u$. Let $L_{\mathcal{C}}(P, u)$ be the expected number of checkings if code $\mathcal{C}$ is used.

Our discovery is an identification entropy, namely the function $H_I(P) = 2\bigl(1 - \sum_{u \in \mathcal{U}} p_u^2\bigr)$. We prove that $L_{\mathcal{C}}(P, P) = \sum_{u \in \mathcal{U}} p_u L_{\mathcal{C}}(P, u) \ge H_I(P)$ and thus also that $L(P) = \min_{\mathcal{C}} \max_{u \in \mathcal{U}} L_{\mathcal{C}}(P, u) \ge H_I(P)$, and related upper bounds, which demonstrate the operational significance of identification entropy in noiseless source coding, similar to the role Shannon entropy plays in noiseless data compression. Also other averages such as $\bar{L}_{\mathcal{C}}(P) = \frac{1}{|\mathcal{U}|} \sum_{u \in \mathcal{U}} L_{\mathcal{C}}(P, u)$ are discussed, in particular for Huffman codes, where classically equivalent Huffman codes may now be different. We also show that prefix codes, where the codewords correspond to the leaves in a regular binary tree, are universally good for this average. © Springer-Verlag Berlin Heidelberg 2006.
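To make the checking rule concrete, here is a minimal sketch in Python (the four-symbol source and the Huffman-shaped prefix code are illustrative assumptions, not taken from the paper). It computes the expected number of checkings $L_{\mathcal{C}}(P, u)$, the average $L_{\mathcal{C}}(P, P)$, the unweighted average $\bar{L}_{\mathcal{C}}(P)$, and $H_I(P)$, so the lower bound above can be verified numerically.

```python
def checkings(cu: str, cv: str) -> int:
    """Letters a user interested in codeword cu compares when the encoded
    output is cv: stop at the first disagreeing letter, or after all of cu
    when the output is cu itself (a prefix code guarantees a disagreement
    in every other case)."""
    if cu == cv:
        return len(cu)
    lcp = 0  # length of the common prefix of cu and cv
    for a, b in zip(cu, cv):
        if a != b:
            break
        lcp += 1
    return lcp + 1  # the first disagreeing letter is also read


def expected_checkings(code: dict, p: dict, u) -> float:
    """L_C(P, u): expected number of checkings for a user interested in u."""
    return sum(p[v] * checkings(code[u], code[v]) for v in code)


def identification_entropy(p: dict) -> float:
    """H_I(P) = 2 * (1 - sum_u p_u^2), for a binary code alphabet."""
    return 2 * (1 - sum(q * q for q in p.values()))


# Illustrative dyadic source with a Huffman-shaped prefix code (assumption).
p = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = {"a": "0", "b": "10", "c": "110", "d": "111"}

L_sym = sum(p[u] * expected_checkings(code, p, u) for u in code)   # L_C(P, P)
L_bar = sum(expected_checkings(code, p, u) for u in code) / len(code)
print(f"L_C(P,P)   = {L_sym:.4f} >= H_I(P) = {identification_entropy(p):.4f}")
print(f"L-bar_C(P) = {L_bar:.4f}")
```

For this dyadic distribution the script prints $L_{\mathcal{C}}(P, P) = 1.3125 = H_I(P)$, so the lower bound happens to hold with equality here; in general the theorem guarantees only the inequality.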

Citation (APA)

Ahlswede, R. (2006). Identification entropy. In Lecture Notes in Computer Science (Vol. 4123, pp. 595–613). Springer. https://doi.org/10.1007/11889342_36
