Online representation learning in recurrent neural language models


Abstract

We investigate an extension of continuous online learning in recurrent neural network language models. The model keeps a separate vector representation of the current unit of text being processed and adaptively adjusts it after each prediction. Initial experiments give promising results, indicating that the method is able to increase language modelling accuracy, while also reducing the number of parameters needed to store the model and the computation required at each step.
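
The abstract describes the mechanism only at a high level, so the following is a minimal sketch of the idea in PyTorch: the model carries a separate unit vector alongside the recurrent state and takes one gradient step on that vector after each word prediction. The class and function names, the GRU cell, the update rule, and all hyperparameters are illustrative assumptions, not the paper's actual architecture.

```python
import torch
import torch.nn as nn

class OnlineRNNLM(nn.Module):
    """Sketch of an RNN language model with a separately maintained unit vector u.

    u summarizes the current unit of text (e.g., the sentence so far)
    and is fed into the recurrent cell alongside the word embedding.
    All dimensions below are illustrative, not taken from the paper.
    """

    def __init__(self, vocab_size, emb_dim=64, hid_dim=128, unit_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.cell = nn.GRUCell(emb_dim + unit_dim, hid_dim)
        self.out = nn.Linear(hid_dim, vocab_size)
        self.hid_dim = hid_dim
        self.unit_dim = unit_dim

    def step(self, word_id, h, u):
        # One prediction step: combine the current word with the unit vector.
        x = torch.cat([self.embed(word_id), u], dim=-1)
        h = self.cell(x, h)
        return self.out(h), h


def score_sentence(model, word_ids, lr_unit=0.1):
    """Score one sentence, adapting the unit vector after every prediction.

    lr_unit and the update rule (one SGD step on u per predicted word)
    are assumptions made for this sketch.
    """
    h = torch.zeros(1, model.hid_dim)
    u = torch.zeros(1, model.unit_dim, requires_grad=True)
    total_logprob = 0.0
    for t in range(len(word_ids) - 1):
        logits, h = model.step(word_ids[t].view(1), h, u)
        logp = torch.log_softmax(logits, dim=-1)[0, word_ids[t + 1]]
        total_logprob += logp.item()
        # Online update: nudge only the unit vector toward a value that
        # would have predicted the observed next word better.
        (grad_u,) = torch.autograd.grad(-logp, u)
        with torch.no_grad():
            u -= lr_unit * grad_u
        h = h.detach()  # keep the graph small: gradients flow to u only
    return total_logprob


# Usage: score a toy sentence over a 10-word vocabulary.
model = OnlineRNNLM(vocab_size=10)
sentence = torch.tensor([1, 4, 2, 7, 3])
print(score_sentence(model, sentence))
```

Note the design choice this sketch makes explicit: only the unit vector is updated at prediction time, so the per-step adaptation cost is proportional to the size of u rather than to the full parameter count of the model.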

Citation (APA)

Rei, M. (2015). Online representation learning in recurrent neural language models. In Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing (pp. 238–243). Association for Computational Linguistics. https://doi.org/10.18653/v1/d15-1026
