Knowledge-Aware Meta-learning for Low-Resource Text Classification

7 Citations · 82 Mendeley Readers

Abstract

Meta-learning has achieved great success in leveraging historically learned knowledge to facilitate the learning of new tasks. However, merely learning from historical tasks, as current meta-learning algorithms do, may not generalize well to testing tasks that are not well supported by the training tasks. This paper studies a low-resource text classification problem and bridges the gap between meta-training and meta-testing tasks by leveraging external knowledge bases. Specifically, we propose KGML, which introduces an additional representation for each sentence, learned from an extracted sentence-specific knowledge graph. Extensive experiments on three datasets demonstrate the effectiveness of KGML under both supervised and unsupervised adaptation settings.

Citation (APA)

Yao, H., Wu, Y., Al-Shedivat, M., & Xing, E. P. (2021). Knowledge-Aware Meta-learning for Low-Resource Text Classification. In EMNLP 2021 - 2021 Conference on Empirical Methods in Natural Language Processing, Proceedings (pp. 1814–1821). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.emnlp-main.136
