Conditional computational entropy, or toward separating pseudoentropy from compressibility

Abstract

We study conditional computational entropy: the amount of randomness a distribution appears to have to a computationally bounded observer who is given some correlated information. By considering conditional versions of HILL entropy (based on indistinguishability from truly random distributions) and Yao entropy (based on incompressibility), we obtain:

- a separation between conditional HILL and Yao entropies (which can be viewed as a separation between the traditional HILL and Yao entropies in the shared random string model, improving on Wee's 2004 separation in the random oracle model);
- the first demonstration of a distribution from which extraction techniques based on Yao entropy produce more pseudorandom bits than appears possible with the traditional HILL-entropy-based techniques;
- a new, natural notion of unpredictability entropy, which implies conditional Yao entropy and thus allows known extraction and hardcore-bit results to be stated and used more generally.

© International Association for Cryptology Research 2007.
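For orientation, the following is an informal sketch of the three conditional entropy notions as they are commonly stated; this is a paraphrase rather than the paper's exact formulation, and circuit-size bounds and negligible slack terms are omitted:

$$
\begin{aligned}
&\text{Conditional HILL: } H^{\mathrm{HILL}}(X \mid Z) \ge k \text{ if there exists } Y \text{ with } \tilde H_{\infty}(Y \mid Z) \ge k \text{ such that } (Y,Z) \text{ is computationally indistinguishable from } (X,Z);\\[2pt]
&\text{Conditional Yao: } H^{\mathrm{Yao}}(X \mid Z) \ge k \text{ if for every efficient compressor/decompressor pair } (c,d) \text{ with output length } \ell < k,\ \Pr[\,d(c(X,Z),Z) = X\,] \lesssim 2^{\ell-k};\\[2pt]
&\text{Unpredictability: } H^{\mathrm{unp}}(X \mid Z) \ge k \text{ if every efficient adversary } A \text{ satisfies } \Pr[\,A(Z) = X\,] \le 2^{-k}.
\end{aligned}
$$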

Citation (APA)

Hsiao, C.-Y., Lu, C.-J., & Reyzin, L. (2007). Conditional computational entropy, or toward separating pseudoentropy from compressibility. In Lecture Notes in Computer Science (Vol. 4515, pp. 169–186). Springer. https://doi.org/10.1007/978-3-540-72540-4_10
