We provide two new results for identification for sources. The first result concerns block codes. In [Ahlswede and Cai, IEEE-IT, 52(9), 4198-4207, 2006] it is proven that the q-ary identification entropy H_{I,q}(P) is a lower bound for the expected number L(P,P) of checkings during the identification process. A necessary assumption of that proof is that the uniform distribution minimizes the symmetric running time for binary block codes. In Sect. 2 this assumption is proved not only for binary block codes but for all q-ary block codes. The second result concerns upper bounds on the worst-case running time. In [Ahlswede, Balkenhol and Kleinewächter, LNCS, 4123, 51-61, 2006] the authors proved in Theorem 3 that L(P) < 3 by an inductive code construction. We present an alteration of their scheme which strengthens this upper bound significantly. © Springer-Verlag Berlin Heidelberg 2013.
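As a reminder of the central quantity, the q-ary identification entropy of Ahlswede and Cai is usually stated as H_{I,q}(P) = q/(q-1) · (1 − Σ_u P(u)²). The following minimal sketch (the function name and use of exact rational arithmetic are our own choices, not from the paper) evaluates this formula for a given distribution:

```python
from fractions import Fraction

def identification_entropy(p, q=2):
    """q-ary identification entropy H_{I,q}(P) = q/(q-1) * (1 - sum of P(u)^2),
    following the formula attributed to Ahlswede and Cai (2006).

    p: probability distribution over the source alphabet, as a list of Fractions.
    q: code alphabet size, q >= 2.
    """
    assert sum(p) == 1, "p must be a probability distribution"
    return Fraction(q, q - 1) * (1 - sum(x * x for x in p))

# Uniform binary source with two symbols, binary codes (q = 2):
# H_{I,2}(P) = 2 * (1 - 1/4 - 1/4) = 1
P = [Fraction(1, 2), Fraction(1, 2)]
print(identification_entropy(P, q=2))  # -> 1
```

Note that for the uniform distribution on q symbols the value is again 1, consistent with the role of the uniform distribution as the minimizer of the symmetric running time discussed in Sect. 2.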
Heup, C. (2013). Two new results for identification for sources. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 7777, pp. 1–10). Springer Verlag. https://doi.org/10.1007/978-3-642-36899-8_1