Persistence pays off: Paying attention to what the LSTM gating mechanism persists


Abstract

Language Models (LMs) are important components in several Natural Language Processing systems. Recurrent Neural Network LMs composed of LSTM units, especially those augmented with an external memory, have achieved state-of-the-art results. However, these models still struggle to process long sequences, which are more likely to contain long-distance dependencies, because information fades over time and the models are biased towards more recent information. In this paper we demonstrate an effective mechanism for retrieving information in a memory-augmented LSTM LM: attending to information in memory in proportion to the number of timesteps the LSTM gating mechanism persisted that information.
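The core idea of persistence-weighted attention can be illustrated with a toy sketch. This is an assumption-heavy illustration, not the authors' implementation: the memory contents, the forget-gate values, and the persistence rule (counting consecutive steps the mean forget gate stays above a threshold) are all hypothetical stand-ins for whatever the paper's model actually computes.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

# Toy memory of past hidden states (T timesteps, d dims). Hypothetical data.
rng = np.random.default_rng(0)
T, d = 5, 4
memory = rng.normal(size=(T, d))         # one stored hidden state per timestep
forget_gates = rng.uniform(size=(T, d))  # forget-gate activations at each step

# Persistence proxy (an assumption, not the paper's exact rule): count how
# many timesteps each stored vector survived, i.e. how long the mean forget
# gate stayed above a threshold after the vector was written.
threshold = 0.5
persistence = np.zeros(T)
for i in range(T):
    steps = 1  # the step at which the vector was written
    for t in range(i, T):
        if forget_gates[t].mean() > threshold:
            steps += 1
        else:
            break
    persistence[i] = steps

# Content-based scores, then scaled in proportion to persistence, so slots
# the gating mechanism kept alive longer receive more attention mass.
query = rng.normal(size=d)
content_scores = memory @ query
weights = softmax(content_scores * persistence / persistence.max())
context = weights @ memory  # persistence-weighted summary of the memory
```

The key step is the multiplication of content scores by normalized persistence before the softmax: two slots with equal content match get different attention mass if the gates persisted one of them for longer.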

Citation (APA)

Salton, G. D., & Kelleher, J. D. (2019). Persistence pays off: Paying attention to what the LSTM gating mechanism persists. In International Conference Recent Advances in Natural Language Processing, RANLP (Vol. 2019-September, pp. 1052–1059). Incoma Ltd. https://doi.org/10.26615/978-954-452-056-4_121
