Towards incremental learning of word embeddings using context informativeness

Abstract

In this paper, we investigate the task of learning word embeddings from very sparse data in an incremental, cognitively plausible way. We focus on the notion of informativeness, that is, the idea that some content is more valuable to the learning process than others. We further highlight the challenges of online learning and argue that previous systems fall short of implementing incrementality. Concretely, we incorporate informativeness into a previously proposed model of nonce learning, using it for context selection and learning rate modulation. We test our system on the task of learning new words from definitions, as well as on the task of learning new words from potentially uninformative contexts. We demonstrate that informativeness is crucial to obtaining state-of-the-art performance in a truly incremental setup.
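The abstract names two concrete uses of informativeness: selecting which context words to learn from, and modulating the learning rate of each update. The sketch below illustrates that general idea in Python, assuming a simple self-information (negative log frequency) proxy for informativeness and a plain additive update toward a context centroid. The function names, toy frequencies, and update rule are illustrative assumptions, not the paper's actual model or measure.

```python
import numpy as np

# Toy stand-ins for the background resources an incremental learner
# would rely on: pretrained embeddings and corpus word frequencies.
# All names and values here are illustrative assumptions.
EMBED_DIM = 50
rng = np.random.default_rng(0)
background = {w: rng.normal(size=EMBED_DIM)
              for w in ("a", "is", "the", "animal", "striped", "equine")}
freq = {"a": 0.05, "is": 0.04, "the": 0.06,
        "animal": 1e-4, "striped": 5e-6, "equine": 1e-6}

def informativeness(word: str) -> float:
    """Self-information proxy: rarer context words are more informative."""
    return -np.log(freq.get(word, 1e-7))

def update_nonce(nonce_vec, context, nonce, base_lr=0.05, top_k=3):
    """One incremental update of a nonce embedding from a single context.

    Context selection: keep only the top_k most informative known words.
    Learning-rate modulation: scale base_lr by their mean informativeness.
    """
    known = [w for w in context if w != nonce and w in background]
    selected = sorted(known, key=informativeness, reverse=True)[:top_k]
    if not selected:
        return nonce_vec  # uninformative context: learn nothing
    target = np.mean([background[w] for w in selected], axis=0)
    lr = base_lr * float(np.mean([informativeness(w) for w in selected]))
    # Move the nonce vector toward the centroid of its informative context.
    return nonce_vec + lr * (target - nonce_vec)

# Example: one-shot learning of "zebra" from a definition-like context.
vec = np.zeros(EMBED_DIM)
vec = update_nonce(vec, ["a", "zebra", "is", "a", "striped",
                         "equine", "animal"], nonce="zebra")
```

Under these assumptions, a definition rich in rare, informative words (like "equine") yields a large, confident update, while a context of frequent function words yields little or no change, which is the intuition the abstract describes.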

Cite

APA

Kabbach, A., Gulordava, K., & Herbelot, A. (2019). Towards incremental learning of word embeddings using context informativeness. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop (pp. 162–168). Association for Computational Linguistics. https://doi.org/10.18653/v1/P19-2022
