Bringing order to neural word embeddings with embeddings augmented by random permutations (EARP)

Abstract

Word order is clearly a vital part of human language, but it has been used comparatively lightly in distributional vector models. This paper presents a new method for incorporating word order information into word vector embedding models by combining the benefits of permutation-based order encoding with the more recent method of skip-gram with negative sampling. The new method introduced here is called Embeddings Augmented by Random Permutations (EARP). It operates by applying permutations to the coordinates of context vector representations during the process of training. Results show an 8% improvement in accuracy on the challenging Bigger Analogy Test Set, and smaller but consistent improvements on other analogy reference sets. These findings demonstrate the importance of order-based information in analogical retrieval tasks, and the utility of random permutations as a means to augment neural embeddings.
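The abstract describes applying random permutations to the coordinates of context vectors during training so that word-order information is encoded. A minimal sketch of that idea, assuming (hypothetically) one fixed permutation per relative context position, might look like this; the positions, dimensionality, and function names here are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)
dim = 8

# Hypothetical setup: one fixed random permutation per relative
# context position (e.g. two words before/after the target word).
positions = [-2, -1, 1, 2]
perms = {p: rng.permutation(dim) for p in positions}

def permute_context(vec, position):
    """Reorder the coordinates of a context vector according to
    the permutation assigned to its relative position, so that
    the same word at different offsets yields different vectors."""
    return vec[perms[position]]

context_vec = rng.standard_normal(dim)
shifted = permute_context(context_vec, -1)
# `shifted` holds the same coordinate values as `context_vec`,
# just in a position-specific order.
```

During skip-gram training, the permuted context vector would then be used in the dot products and negative-sampling updates in place of the unpermuted one, making the learned embeddings sensitive to where a context word occurred.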

Citation (APA)

Cohen, T., & Widdows, D. (2018). Bringing order to neural word embeddings with embeddings augmented by random permutations (EARP). In CoNLL 2018 - 22nd Conference on Computational Natural Language Learning, Proceedings (pp. 465–475). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/k18-1045
