Continual learning long short term memory


Abstract

Catastrophic forgetting in neural networks refers to the degradation of a deep learning model's performance on previously learned tasks as it learns new ones. To address this problem, we propose a novel Continual Learning Long Short Term Memory (CL-LSTM) cell for Recurrent Neural Networks (RNNs). CL-LSTM considers not only the state of each individual task's output gates but also the correlation of these states across tasks, so that the model can incrementally learn new tasks without catastrophically forgetting previous ones. Experimental results demonstrate significant improvements of CL-LSTM over state-of-the-art approaches on spoken language understanding (SLU) tasks.
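
The abstract does not give the cell's equations, so the sketch below is only a rough illustration of the general idea: shared input/forget/candidate gates, a separate output-gate transform per task, and a simple average of earlier tasks' output gates standing in for the cross-task correlation term. All names (CLLSTMCell, num_tasks, task_id) and the averaging rule are assumptions for illustration, not the paper's actual formulation.

```python
import torch
import torch.nn as nn


class CLLSTMCell(nn.Module):
    """Illustrative task-aware LSTM cell (not the paper's exact equations)."""

    def __init__(self, input_size: int, hidden_size: int, num_tasks: int):
        super().__init__()
        # Shared input, forget, and cell-candidate transforms.
        self.gates = nn.Linear(input_size + hidden_size, 3 * hidden_size)
        # One output-gate transform per task, so each task keeps its own
        # output-gate state while the rest of the cell is shared.
        self.output_gates = nn.ModuleList(
            [nn.Linear(input_size + hidden_size, hidden_size) for _ in range(num_tasks)]
        )

    def forward(self, x, state, task_id: int):
        h, c = state
        z = torch.cat([x, h], dim=-1)
        i, f, g = self.gates(z).chunk(3, dim=-1)
        c = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
        # Current task's output gate.
        o = torch.sigmoid(self.output_gates[task_id](z))
        # Crude stand-in for cross-task correlation: average with the
        # output gates of previously learned tasks.
        if task_id > 0:
            prev = torch.stack(
                [torch.sigmoid(self.output_gates[t](z)) for t in range(task_id)]
            ).mean(dim=0)
            o = 0.5 * (o + prev)
        h = o * torch.tanh(c)
        return h, (h, c)


# Example: one step on task 1 with batch size 2.
cell = CLLSTMCell(input_size=8, hidden_size=16, num_tasks=3)
x = torch.randn(2, 8)
h0 = c0 = torch.zeros(2, 16)
h1, _ = cell(x, (h0, c0), task_id=1)
```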

Citation (APA)

Guo, X., Tian, Y., Xue, Q., Lampropoulos, P., Eliuk, S., Barner, K., & Wang, X. (2020). Continual learning long short term memory. In Findings of the Association for Computational Linguistics: EMNLP 2020 (pp. 1817–1822). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.findings-emnlp.164
