Continual Lifelong Learning in Natural Language Processing: A Survey


Abstract

Continual learning (CL) aims to enable information systems to learn from a continuous data stream across time. However, it is difficult for existing deep learning architectures to learn a new task without largely forgetting previously acquired knowledge. Furthermore, CL is particularly challenging for language learning, as natural language is ambiguous: it is discrete, compositional, and its meaning is context-dependent. In this work, we look at the problem of CL through the lens of various NLP tasks. Our survey discusses major challenges in CL and current methods applied in neural network models. We also provide a critical review of the existing CL evaluation methods and datasets in NLP. Finally, we present our outlook on future research directions.

Citation (APA)

Biesialska, M., Biesialska, K., & Costa-Jussà, M. R. (2020). Continual Lifelong Learning in Natural Language Processing: A Survey. In COLING 2020 - 28th International Conference on Computational Linguistics, Proceedings of the Conference (pp. 6523–6541). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.coling-main.574
