Efficient Lifelong Relation Extraction with Dynamic Regularization


Abstract

Relation extraction has received increasing attention due to its important role in natural language processing applications. However, most existing methods are designed for a fixed set of relations and are unable to handle the lifelong learning scenario, i.e., adapting a well-trained model to newly added relations without catastrophically forgetting previously learned knowledge. In this work, we present a memory-efficient dynamic regularization method to address this issue. Specifically, two types of powerful consolidation regularizers are applied to preserve the learned knowledge and ensure the robustness of the model, and the regularization strength is adaptively adjusted with respect to the dynamics of the training losses. Experimental results on multiple benchmarks show that our proposed method significantly outperforms prior state-of-the-art approaches.
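The abstract's core idea, consolidation penalties whose strength tracks the dynamics of the training loss, can be illustrated with a small sketch. This is a hypothetical, simplified illustration (an EWC-style quadratic penalty with a loss-drop-driven multiplier), not the authors' exact formulation; the function names, `sensitivity` parameter, and update rule are all assumptions for exposition.

```python
# Hypothetical sketch: regularization strength adapted to loss dynamics.
# When the current-task loss is falling quickly, the consolidation
# penalty is strengthened to protect old knowledge; when the loss
# plateaus, it relaxes back toward the base strength.

def adaptive_lambda(loss_history, base_lambda=1.0, sensitivity=5.0):
    """Scale regularization strength by the recent relative loss drop.

    loss_history: per-step training losses, most recent last.
    Returns a non-negative regularization coefficient.
    """
    if len(loss_history) < 2:
        return base_lambda
    prev, curr = loss_history[-2], loss_history[-1]
    # Relative improvement, clipped at 0 so a rising loss never
    # weakens the penalty below its base value.
    drop = max(0.0, (prev - curr) / max(prev, 1e-8))
    return base_lambda * (1.0 + sensitivity * drop)

def consolidated_loss(task_loss, params, anchor, importance, lam):
    """Task loss plus an EWC-style quadratic consolidation penalty.

    anchor: parameter values after the previous task;
    importance: per-parameter importance weights (assumed given).
    """
    penalty = sum(w * (p - a) ** 2
                  for p, a, w in zip(params, anchor, importance))
    return task_loss + lam * penalty
```

For example, a loss drop from 1.0 to 0.5 between steps yields a coefficient of `1.0 * (1 + 5.0 * 0.5) = 3.5`, so the penalty is tightened while the model is changing fast on the new relations.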

Citation (APA)

Shen, H., Ju, S., Sun, J., Chen, R., & Liu, Y. (2020). Efficient Lifelong Relation Extraction with Dynamic Regularization. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 12431 LNAI, pp. 181–192). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-60457-8_15
