Unsupervised Cross-Domain Prerequisite Chain Learning using Variational Graph Autoencoders

Abstract

Learning prerequisite chains is an essential task for efficiently acquiring knowledge in both known and unknown domains. For example, one may be an expert in the natural language processing (NLP) domain but want to determine the best order in which to learn new concepts in the unfamiliar computer vision (CV) domain. Both domains share common concepts, such as machine learning basics and deep learning models. In this paper, we propose unsupervised cross-domain concept prerequisite chain learning using an optimized variational graph autoencoder. Our model learns to transfer concept prerequisite relations from an information-rich domain (source domain) to an information-poor domain (target domain), substantially surpassing other baseline models. We also expand an existing dataset by introducing two new domains: CV and bioinformatics (BIO). The annotated data and resources, as well as the code, will be made publicly available.
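The abstract gives no implementation details, so the following is only a minimal sketch of how a variational graph autoencoder (VGAE) can score candidate prerequisite relations over a concept graph, written with PyTorch Geometric. It is not the authors' code: the optimized VGAE in the paper, the concept features, the graph construction, and the cross-domain transfer step are all replaced here by placeholder assumptions.

    # Minimal VGAE link-prediction sketch (assumed setup, not the paper's model).
    import torch
    from torch_geometric.nn import GCNConv, VGAE

    class ConceptEncoder(torch.nn.Module):
        """Two-layer GCN encoder producing mu and log-std for each concept node."""
        def __init__(self, in_dim, hidden_dim, latent_dim):
            super().__init__()
            self.conv1 = GCNConv(in_dim, hidden_dim)
            self.conv_mu = GCNConv(hidden_dim, latent_dim)
            self.conv_logstd = GCNConv(hidden_dim, latent_dim)

        def forward(self, x, edge_index):
            h = self.conv1(x, edge_index).relu()
            return self.conv_mu(h, edge_index), self.conv_logstd(h, edge_index)

    # x: concept node features (dummy values; in practice, text-derived embeddings)
    # edge_index: known prerequisite edges, e.g. from the information-rich source domain
    x = torch.randn(100, 300)                      # 100 concepts, 300-d features (assumed)
    edge_index = torch.randint(0, 100, (2, 400))   # dummy prerequisite edges

    model = VGAE(ConceptEncoder(in_dim=300, hidden_dim=64, latent_dim=32))
    optimizer = torch.optim.Adam(model.parameters(), lr=0.01)

    for epoch in range(100):
        optimizer.zero_grad()
        z = model.encode(x, edge_index)
        # Reconstruction loss on observed edges plus KL regularization
        loss = model.recon_loss(z, edge_index) + (1.0 / x.size(0)) * model.kl_loss()
        loss.backward()
        optimizer.step()

    # After training, a candidate prerequisite relation between two concepts can be
    # scored by decoding their latent embeddings (inner-product decoder):
    with torch.no_grad():
        z = model.encode(x, edge_index)
        score = torch.sigmoid((z[3] * z[7]).sum())  # hypothetical concept pair

In the cross-domain setting described above, the training edges would come from the source domain, and the decoder would then be applied to concept pairs in the target domain; how the paper's optimized VGAE departs from this vanilla setup is not stated in the abstract.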

Citation (APA)

Li, I., Yan, V., Li, T., Qu, R., & Radev, D. (2021). Unsupervised Cross-Domain Prerequisite Chain Learning using Variational Graph Autoencoders. In ACL-IJCNLP 2021 - 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, Proceedings of the Conference (Vol. 2, pp. 1005–1011). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.acl-short.127
