Estimating the Entropy of Linguistic Distributions


Abstract

Shannon entropy is often a quantity of interest to linguists studying the communicative capacity of human language. However, entropy must typically be estimated from observed data because researchers do not have access to the underlying probability distribution that gives rise to these data. While entropy estimation is a well-studied problem in other fields, there is not yet a comprehensive exploration of the efficacy of entropy estimators for use with linguistic data. In this work, we fill this void, studying the empirical effectiveness of different entropy estimators for linguistic distributions. In a replication of two recent information-theoretic linguistic studies, we find evidence that the reported effect size is over-estimated due to over-reliance on poor entropy estimators. Finally, we end our paper with concrete recommendations for entropy estimation depending on distribution type and data availability.
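To make the estimation problem concrete, here is a minimal sketch of the naive plug-in (maximum-likelihood) entropy estimator alongside the classic Miller-Madow bias correction. These are standard estimators from the literature, shown here only as illustration; the paper's own comparison covers a broader set, and the function names below are mine.

```python
import math
from collections import Counter

def plugin_entropy(samples):
    """Plug-in (maximum-likelihood) entropy estimate in nats.

    Substitutes empirical frequencies for the unknown true
    probabilities; known to underestimate entropy on average.
    """
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

def miller_madow_entropy(samples):
    """Plug-in estimate plus the Miller-Madow correction (K - 1) / (2n),

    where K is the number of distinct symbols observed and n the
    sample size; this partially offsets the plug-in estimator's bias.
    """
    n = len(samples)
    k = len(set(samples))
    return plugin_entropy(samples) + (k - 1) / (2 * n)

# Example: a small sample from a skewed "word" distribution
sample = list("aaaabbbc")
print(plugin_entropy(sample))
print(miller_madow_entropy(sample))
```

With small samples, typical of linguistic data at the word or morpheme level, the gap between the two estimates can be substantial, which is the kind of bias the abstract attributes to "poor entropy estimators."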

Citation (APA)

Arora, A., Meister, C., & Cotterell, R. (2022). Estimating the Entropy of Linguistic Distributions. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 2, pp. 175–195). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.acl-short.20
