We present a constancy rate principle governing language generation. We show that this principle implies that local measures of entropy (ignoring context) should increase with the sentence number. We demonstrate that this is indeed the case by measuring entropy in three different ways. We also show that this effect has both lexical (which words are used) and non-lexical (how the words are used) causes.
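The claim that local, context-ignoring entropy should rise with sentence number can be illustrated with a minimal sketch. This is my own toy illustration, not the paper's method: it fits a simple unigram model over a corpus and reports the average per-word cross-entropy at each sentence position, ignoring all context, which is one simple instance of the kind of local measure the abstract describes. The function name and toy setup are assumptions for illustration.

```python
import math
from collections import Counter

def unigram_entropy_per_sentence(docs):
    """Average per-word cross-entropy (bits) at each sentence position.

    docs: list of documents; each document is a list of sentences;
    each sentence is a list of word tokens. The unigram model is fit
    on the whole corpus, so context within and across sentences is
    deliberately ignored -- a purely local entropy measure.
    """
    # Fit a unigram model on all tokens in the corpus.
    counts = Counter(w for doc in docs for sent in doc for w in sent)
    total = sum(counts.values())
    logp = {w: math.log2(c / total) for w, c in counts.items()}

    # Average negative log-probability per word, grouped by the
    # position of the sentence within its document.
    max_len = max(len(doc) for doc in docs)
    entropies = []
    for i in range(max_len):
        words = [w for doc in docs if i < len(doc) for w in doc[i]]
        entropies.append(-sum(logp[w] for w in words) / len(words))
    return entropies
```

Under the entropy rate constancy principle, later sentences lean more on accumulated context, so a measure like this one, which sees no context, should report higher values at later positions on real text. The paper's actual estimators are more sophisticated; this sketch only shows the shape of the measurement.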
Genzel, D., & Charniak, E. (2002). Entropy Rate Constancy in Text. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 2002-July, pp. 199–206). Association for Computational Linguistics (ACL). https://doi.org/10.3115/1073083.1073117