While continuous word embeddings are gaining popularity, current models are based solely on linear contexts. In this work, we generalize the skip-gram model with negative sampling introduced by Mikolov et al. (2013) to include arbitrary contexts. In particular, we perform experiments with dependency-based contexts, and show that they produce markedly different embeddings. The dependency-based embeddings are less topical and exhibit more functional similarity than the original skip-gram embeddings.
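The key difference from the original skip-gram model is the context-extraction step: instead of pairing each word with its linear-window neighbors, each word is paired with the words it is syntactically connected to, labeled by the dependency relation (with prepositions collapsed into the relation). Below is a minimal illustrative sketch of that extraction, assuming spaCy as the parser; the `dependency_contexts` helper and the pair format are assumptions for illustration, not the authors' released word2vecf code.

```python
# Sketch of dependency-based context extraction in the spirit of
# Levy & Goldberg (2014). Uses spaCy; exact labels depend on the parser.
import spacy

nlp = spacy.load("en_core_web_sm")  # assumes this model is installed

def dependency_contexts(sentence):
    """Yield (word, context) pairs from a dependency parse.

    For a word w attached to head h by relation r, w gets context
    (h, r^-1) and h gets context (w, r). Prepositions are collapsed so
    that the noun they introduce attaches directly to the head, with
    the preposition folded into the relation label.
    """
    doc = nlp(sentence)
    for tok in doc:
        if tok.dep_ == "ROOT" or tok.is_punct:
            continue
        if tok.dep_ == "prep":
            continue  # the preposition itself is handled via collapsing
        head = tok.head
        if head.dep_ == "prep":
            # Collapse: attach the preposition's object to the
            # preposition's own head, folding "prep" + the preposition
            # word into a single relation label such as prep_with.
            rel = f"prep_{head.text.lower()}"
            grand = head.head
            yield (tok.text, f"{grand.text}/{rel}-1")
            yield (grand.text, f"{tok.text}/{rel}")
        else:
            yield (tok.text, f"{head.text}/{tok.dep_}-1")
            yield (head.text, f"{tok.text}/{tok.dep_}")

for pair in dependency_contexts(
        "Australian scientist discovers star with telescope"):
    print(pair)
```

On the paper's example sentence this yields pairs such as ("telescope", "discovers/prep_with-1") and ("discovers", "telescope/prep_with"), so "telescope" is a context of "discovers" even though they are not adjacent; the resulting (word, context) pairs can then be fed to any skip-gram trainer that accepts arbitrary contexts.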
Citation
Levy, O., & Goldberg, Y. (2014). Dependency-based word embeddings. In 52nd Annual Meeting of the Association for Computational Linguistics, ACL 2014 - Proceedings of the Conference (Vol. 2, pp. 302–308). Association for Computational Linguistics (ACL). https://doi.org/10.3115/v1/p14-2050