Rehearsal-free Continual Language Learning via Efficient Parameter Isolation


Abstract

We study the problem of overcoming catastrophic forgetting when learning a series of language processing tasks. Unlike previous methods, we emphasize the importance of not caching data from earlier tasks, which makes the problem more challenging. Our proposed method applies a parameter isolation strategy: for each task, it allocates a small set of private parameters and learns them alongside a shared pre-trained model. To load the correct parameters at test time, we introduce a simple yet effective non-parametric method. Experiments on continual language learning benchmarks show that our method significantly outperforms all existing methods that do not cache data, and is comparable to (or even better than) those that use historical data.
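To make the parameter isolation idea concrete, below is a minimal illustrative sketch, not the authors' implementation: a frozen shared encoder stands in for the pre-trained model, each task gets its own small private head, and a non-parametric nearest-mean rule over cached feature statistics (not raw data) decides which task's parameters to load at test time. All names here (SharedEncoder, TaskAdapter, select_task) are hypothetical.

```python
# Illustrative sketch of parameter isolation for rehearsal-free continual
# learning. Assumptions: a frozen shared encoder, one private adapter head
# per task, and a nearest-mean task selector that stores only per-task
# feature statistics, never the tasks' raw data.
import torch
import torch.nn as nn

class SharedEncoder(nn.Module):
    """Stand-in for a frozen pre-trained backbone."""
    def __init__(self, dim=32):
        super().__init__()
        self.proj = nn.Linear(dim, dim)
        for p in self.parameters():   # shared weights are never updated
            p.requires_grad_(False)

    def forward(self, x):
        return torch.tanh(self.proj(x))

class TaskAdapter(nn.Module):
    """Small private parameters allocated to a single task."""
    def __init__(self, dim=32, n_classes=2):
        super().__init__()
        self.head = nn.Linear(dim, n_classes)  # only these weights train

    def forward(self, h):
        return self.head(h)

encoder = SharedEncoder()
adapters = {}     # task_id -> private parameters
task_means = {}   # task_id -> mean encoder feature (non-parametric selector)

def train_task(task_id, x, y, epochs=50):
    adapter = TaskAdapter()
    opt = torch.optim.Adam(adapter.parameters(), lr=1e-2)
    for _ in range(epochs):
        loss = nn.functional.cross_entropy(adapter(encoder(x)), y)
        opt.zero_grad()
        loss.backward()               # gradients reach only the adapter
        opt.step()
    adapters[task_id] = adapter
    # Cache a single feature statistic per task, not the task's data.
    task_means[task_id] = encoder(x).mean(dim=0).detach()

def select_task(x):
    """Pick the task whose stored feature mean is nearest to the input."""
    h = encoder(x).mean(dim=0)
    return min(task_means, key=lambda t: torch.norm(h - task_means[t]).item())

def predict(x):
    task_id = select_task(x)          # route to the right private parameters
    return adapters[task_id](encoder(x)).argmax(dim=-1)

# Toy usage: two tasks with different input distributions.
torch.manual_seed(0)
x1, y1 = torch.randn(64, 32) + 2.0, torch.randint(0, 2, (64,))
x2, y2 = torch.randn(64, 32) - 2.0, torch.randint(0, 2, (64,))
train_task("task1", x1, y1)
train_task("task2", x2, y2)
print(predict(x1[:4]), predict(x2[:4]))
```

Because each task's parameters are isolated and the selector keeps only one feature vector per task, nothing learned for earlier tasks is overwritten and no historical examples are retained, consistent with the rehearsal-free setting the abstract describes.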

Citation (APA)

Wang, Z., Liu, Y., Ji, T., Wang, X., Wu, Y., Jiang, C., … Zeng, W. (2023). Rehearsal-free Continual Language Learning via Efficient Parameter Isolation. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 1, pp. 10933–10946). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.acl-long.612
