Dependency-based word embeddings


Abstract

While continuous word embeddings are gaining popularity, current models are based solely on linear contexts. In this work, we generalize the skip-gram model with negative sampling introduced by Mikolov et al. to include arbitrary contexts. In particular, we perform experiments with dependency-based contexts, and show that they produce markedly different embeddings. The dependency-based embeddings are less topical and exhibit more functional similarity than the original skip-gram embeddings. © 2014 Association for Computational Linguistics.
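To make the distinction concrete, here is a small sketch (not the authors' code) contrasting the linear-window contexts used by standard skip-gram with dependency-based contexts of the kind the paper describes: for each dependency edge, the head gets the modifier plus the relation label as a context, and the modifier gets the head plus the inverse relation. The example sentence and its parse are hardcoded assumptions for illustration, and the paper's preposition-collapsing step is omitted for brevity.

```python
# Each token: (word, index of its head, or -1 for the root, dependency relation)
parse = [
    ("australian", 1, "amod"),
    ("scientist", 2, "nsubj"),
    ("discovers", -1, "root"),
    ("star", 2, "dobj"),
    ("with", 2, "prep"),
    ("telescope", 4, "pobj"),
]

def linear_contexts(tokens, window=2):
    """Standard skip-gram contexts: words within +/- `window` positions."""
    pairs = []
    for i, w in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((w, tokens[j]))
    return pairs

def dependency_contexts(parse):
    """Dependency contexts: for an edge (modifier, head, rel),
    the head gets context modifier/rel and the modifier gets
    context head/rel-1 (preposition collapsing omitted)."""
    pairs = []
    for word, head, rel in parse:
        if head == -1:
            continue
        head_word = parse[head][0]
        pairs.append((head_word, f"{word}/{rel}"))
        pairs.append((word, f"{head_word}/{rel}-1"))
    return pairs

tokens = [w for w, _, _ in parse]
print(linear_contexts(tokens))
print(dependency_contexts(parse))
```

Note how "discovers" receives "telescope" as a linear context only if the window is wide enough, whereas under dependency contexts it is related to "star/dobj" regardless of surface distance; this is why dependency-based embeddings tend toward functional rather than topical similarity.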

Citation (APA)

Levy, O., & Goldberg, Y. (2014). Dependency-based word embeddings. In 52nd Annual Meeting of the Association for Computational Linguistics, ACL 2014 - Proceedings of the Conference (Vol. 2, pp. 302–308). Association for Computational Linguistics (ACL). https://doi.org/10.3115/v1/p14-2050
