Language models based on semantic composition

54 citations · 143 Mendeley readers

Abstract

In this paper we propose a novel statistical language model to capture long-range semantic dependencies. Specifically, we apply the concept of semantic composition to the problem of constructing predictive history representations for upcoming words. We also examine the influence of the underlying semantic space on the composition task by comparing spatial semantic representations against topic-based ones. When combined with a standard n-gram language model, the composition models yield perplexity reductions over the n-gram model alone. We also obtain perplexity reductions when integrating our models with a structured language model. © 2009 ACL and AFNLP.
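The abstract does not spell out the composition functions or the integration scheme, so the sketch below is only a rough illustration of the general idea: compose the vectors of the history words into a single representation, score candidate next words by similarity to it, and linearly interpolate the resulting distribution with an n-gram model. The additive composition, the softmax scoring, the interpolation weight, and all vocabulary items and probabilities here are hypothetical, not taken from the paper.

```python
import numpy as np

# Hypothetical toy semantic space: each word maps to a dense vector.
# (The paper compares spatial and topic-based spaces; these vectors are random stand-ins.)
rng = np.random.default_rng(0)
vocab = ["the", "stock", "market", "fell", "rose"]
vectors = {w: rng.normal(size=8) for w in vocab}

def compose_history(words, vectors):
    """Build a history representation by (assumed) additive vector composition."""
    return np.sum([vectors[w] for w in words], axis=0)

def semantic_probs(history_vec, vectors, vocab):
    """Softmax over each candidate word's similarity to the composed history."""
    sims = np.array([history_vec @ vectors[w] for w in vocab])
    exp = np.exp(sims - sims.max())  # stabilized softmax
    return exp / exp.sum()

def interpolate(p_ngram, p_semantic, lam=0.7):
    """Linearly interpolate n-gram and semantic-model probabilities."""
    return lam * p_ngram + (1 - lam) * p_semantic

history = compose_history(["the", "stock", "market"], vectors)
p_sem = semantic_probs(history, vectors, vocab)
# Suppose an n-gram model assigned these (made-up) probabilities to the same vocab:
p_ng = np.array([0.1, 0.1, 0.1, 0.4, 0.3])
p_mix = interpolate(p_ng, p_sem)
```

Since both component distributions sum to one, their convex combination is again a valid distribution, which is what makes this simple interpolation a safe way to inject long-range semantic information into an n-gram model.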

Citation (APA)

Mitchell, J., & Lapata, M. (2009). Language models based on semantic composition. In EMNLP 2009 - Proceedings of the 2009 Conference on Empirical Methods in Natural Language Processing: A Meeting of SIGDAT, a Special Interest Group of ACL, Held in Conjunction with ACL-IJCNLP 2009 (pp. 430–439). Association for Computational Linguistics (ACL). https://doi.org/10.3115/1699510.1699567
