Meta-learning improves lifelong relation extraction


Abstract

Most existing relation extraction models assume a fixed set of relations and are unable to exploit newly available supervision data to extract new relations. Alleviating this problem requires approaches that make relation extraction models capable of continual adaptation and learning. We investigate and present results for such an approach, based on a combination of ideas from lifelong learning and optimization-based meta-learning. We evaluate the proposed approach on two recent lifelong relation extraction benchmarks, and demonstrate that it markedly outperforms current state-of-the-art approaches.
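To illustrate what "optimization-based meta-learning" refers to here, the sketch below shows a Reptile-style meta-update, one common instance of that family: adapt a copy of the parameters on each task with a few gradient steps, then move the meta-parameters toward the adapted ones. This is an illustrative toy, not the paper's actual model; all function names, the scalar-parameter setup, and the hyperparameters are hypothetical.

```python
# Reptile-style optimization-based meta-learning, minimal sketch.
# Parameters are a dict of name -> float; grad_fn(params, batch) returns
# gradients in the same dict shape. Everything here is illustrative.

def inner_update(params, batches, inner_lr, grad_fn):
    """Run a few SGD steps on one task's data; return adapted parameters."""
    adapted = dict(params)
    for batch in batches:
        grads = grad_fn(adapted, batch)
        adapted = {k: v - inner_lr * grads[k] for k, v in adapted.items()}
    return adapted

def reptile_meta_step(params, tasks, inner_lr, meta_lr):
    """One meta-step: for each (batches, grad_fn) task, adapt a copy of the
    parameters, then move the meta-parameters toward the adapted copy."""
    for batches, grad_fn in tasks:
        adapted = inner_update(params, batches, inner_lr, grad_fn)
        params = {k: v + meta_lr * (adapted[k] - v) for k, v in params.items()}
    return params

# Toy usage: one scalar parameter w, two squared-error "tasks" whose optima
# sit at w = 1.0 and w = 3.0 (loss (w - t)^2, gradient 2 * (w - t)).
def grad_for(target):
    return lambda p, _batch: {"w": 2.0 * (p["w"] - target)}

params = {"w": 0.0}
tasks = [([None] * 5, grad_for(1.0)), ([None] * 5, grad_for(3.0))]
for _ in range(100):
    params = reptile_meta_step(params, tasks, inner_lr=0.1, meta_lr=0.5)
# The meta-initialization settles between the two task optima, i.e. at a
# point from which either task can be reached in a few gradient steps.
print(params["w"])
```

The appeal for lifelong settings is that the meta-learned initialization adapts quickly to each new relation with only a few gradient steps, rather than being retrained from scratch.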

Citation (APA)

Obamuyide, A., & Vlachos, A. (2019). Meta-learning improves lifelong relation extraction. In ACL 2019 - 4th Workshop on Representation Learning for NLP, RepL4NLP 2019 - Proceedings of the Workshop (pp. 224–229). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/w19-4326
