The task of concept prerequisite chain learning is to automatically determine whether prerequisite relationships exist among pairs of concepts. In this paper, we frame learning prerequisite relationships among concepts as an unsupervised task with no access to labeled concept pairs during training. We propose a model called the Relational-Variational Graph AutoEncoder (R-VGAE) to predict concept relations within a graph consisting of concept and resource nodes. Results show that our unsupervised approach outperforms graph-based semi-supervised methods and other baseline methods by up to 9.77% and 10.47% in terms of prerequisite relation prediction accuracy and F1 score, respectively. Notably, our method is the first graph-based model to exploit deep learning representations for unsupervised prerequisite learning. We also expand an existing corpus to a total of 1,717 English lecture slide files related to Natural Language Processing (NLP), with manual concept-pair annotations over 322 topics.
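To make the described architecture more concrete, the following is a minimal sketch of a relational variational graph autoencoder in the spirit of R-VGAE, built on PyTorch Geometric's RGCNConv. All module names, dimensions, the two-relation setup (e.g., concept-concept vs. concept-resource edges), and the inner-product decoder are illustrative assumptions, not the authors' released implementation.

```python
# Illustrative sketch of a relational variational graph autoencoder.
# Assumes PyTorch and PyTorch Geometric are installed; names and
# hyperparameters are hypothetical, not from the paper's code release.
import torch
import torch.nn.functional as F
from torch_geometric.nn import RGCNConv


class RelationalVGAEEncoder(torch.nn.Module):
    def __init__(self, in_dim, hidden_dim, latent_dim, num_relations):
        super().__init__()
        # Relation-aware graph convolutions: one weight matrix per edge
        # type, so concept-concept and concept-resource edges are
        # transformed differently.
        self.conv1 = RGCNConv(in_dim, hidden_dim, num_relations)
        self.conv_mu = RGCNConv(hidden_dim, latent_dim, num_relations)
        self.conv_logstd = RGCNConv(hidden_dim, latent_dim, num_relations)

    def forward(self, x, edge_index, edge_type):
        h = F.relu(self.conv1(x, edge_index, edge_type))
        # Return the parameters of the approximate posterior over
        # latent node embeddings.
        return (self.conv_mu(h, edge_index, edge_type),
                self.conv_logstd(h, edge_index, edge_type))


def reparameterize(mu, logstd):
    # Sample z ~ N(mu, sigma^2) via the reparameterization trick so the
    # sampling step stays differentiable.
    return mu + torch.randn_like(logstd) * torch.exp(logstd)


def score_prerequisite(z, src, dst):
    # Inner-product decoder (an assumption here): the score that the
    # concept at index `src` is a prerequisite of the one at `dst`.
    return torch.sigmoid((z[src] * z[dst]).sum(dim=-1))
```

In this setup, training would maximize a reconstruction objective over observed edges plus a KL regularizer on the latent embeddings, and candidate concept pairs would be ranked at inference time by their decoder scores.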
Li, I., Fabbri, A., Hingmire, S., & Radev, D. (2020). R-VGAE: Relational-Variational Graph Autoencoder for Unsupervised Prerequisite Chain Learning. In Proceedings of the 28th International Conference on Computational Linguistics (COLING 2020) (pp. 1147–1157). Association for Computational Linguistics. https://doi.org/10.18653/v1/2020.coling-main.99