Entropy Rate Constancy in Text


Abstract

We present an entropy rate constancy principle governing language generation. We show that this principle implies that local measures of entropy (those ignoring context) should increase with sentence number. We demonstrate that this is indeed the case by measuring entropy in three different ways. We also show that this effect has both lexical (which words are used) and non-lexical (how the words are used) causes.
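To make the abstract's notion of a "local measure of entropy" concrete, the following is a minimal sketch, not the paper's actual estimator or data: it scores each sentence by its average per-word negative log probability under an add-one-smoothed unigram model fit on a toy document. The corpus, the smoothing choice, and the function name are all illustrative assumptions.

```python
# Sketch of a context-free (local) entropy measure per sentence.
# The toy document and add-one smoothing are assumptions for
# illustration, not the paper's corpus or estimation method.
import math
from collections import Counter

def unigram_entropy_per_word(sentence, counts, total, vocab_size):
    """Average -log2 unigram probability per word, with add-one
    smoothing (an assumption made for this sketch)."""
    return sum(
        -math.log2((counts[w] + 1) / (total + vocab_size))
        for w in sentence
    ) / len(sentence)

# Toy "document": a list of tokenized sentences.
doc = [
    ["the", "cat", "sat"],
    ["the", "cat", "chased", "a", "mouse"],
    ["entropy", "of", "later", "sentences", "tends", "upward"],
]

# Fit unigram counts on the whole document.
counts = Counter(w for s in doc for w in s)
total = sum(counts.values())
vocab = len(counts)

for i, sent in enumerate(doc, start=1):
    h = unigram_entropy_per_word(sent, counts, total, vocab)
    print(f"sentence {i}: {h:.2f} bits/word")
```

In this toy run, the later sentence scores higher simply because it uses rarer words; the paper's claim is that this kind of out-of-context entropy rises systematically with sentence position in real text.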

Citation (APA)
Genzel, D., & Charniak, E. (2002). Entropy Rate Constancy in Text. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 2002-July, pp. 199–206). Association for Computational Linguistics (ACL). https://doi.org/10.3115/1073083.1073117
