Predicting Prerequisite Relations for Unseen Concepts

4 citations · 19 Mendeley readers

Abstract

Concept prerequisite learning (CPL) plays a key role in developing technologies that assist people in learning a new complex topic or concept. Previous work commonly assumes that all concepts are given at training time and focuses solely on predicting the unseen prerequisite relationships between them. However, many real-world scenarios involve concepts that are unseen at training time, a setting that remains relatively unexplored. This paper studies that problem and proposes a novel alternating knowledge distillation approach that takes advantage of both content- and graph-based models for the task. Extensive experiments on three public benchmarks demonstrate improvements of up to 10% in F1 score.
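The alternating distillation idea in the abstract can be sketched concretely: two prerequisite classifiers, a content-based one (over text features, which generalizes to unseen concepts) and a graph-based one (over concept-graph features), take turns as teacher and student, with the frozen teacher providing soft labels on unlabeled concept pairs. The sketch below is a minimal illustration under assumed choices (MLP classifiers over precomputed 128-dimensional features, random toy data, and a standard cross-entropy-plus-KL distillation loss); it is not the authors' implementation, whose models, features, and training schedule differ.

```python
# Minimal sketch of alternating knowledge distillation between a
# content-based and a graph-based prerequisite classifier.
# Model shapes, loss weights, and data are illustrative assumptions,
# not the paper's exact setup.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PairClassifier(nn.Module):
    """Scores whether concept A is a prerequisite of concept B.
    Stands in for either the content-based model (text features)
    or the graph-based model (graph features)."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(2 * dim, 64), nn.ReLU(), nn.Linear(64, 2))

    def forward(self, a, b):
        return self.net(torch.cat([a, b], dim=-1))

def distill_step(student, teacher, opt, x_lab, y_lab, xs_unlab, xt_unlab,
                 alpha=0.5, T=2.0):
    """One training step: supervised cross-entropy on labeled pairs plus
    KL divergence toward the frozen teacher on unlabeled pairs. The student
    and teacher see their own views (xs_unlab vs. xt_unlab) of the same
    unlabeled concept pairs."""
    student.train()
    teacher.eval()
    ce = F.cross_entropy(student(*x_lab), y_lab)
    with torch.no_grad():
        soft = F.softmax(teacher(*xt_unlab) / T, dim=-1)
    kd = F.kl_div(F.log_softmax(student(*xs_unlab) / T, dim=-1),
                  soft, reduction="batchmean")
    loss = (1 - alpha) * ce + alpha * (T ** 2) * kd
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# Toy data: one feature view per model for each concept in a pair.
dim, n_lab, n_unlab = 128, 256, 1024
content, graph = PairClassifier(dim), PairClassifier(dim)
opt_c = torch.optim.Adam(content.parameters(), lr=1e-3)
opt_g = torch.optim.Adam(graph.parameters(), lr=1e-3)
xc = (torch.randn(n_lab, dim), torch.randn(n_lab, dim))      # content view, labeled pairs
xg = (torch.randn(n_lab, dim), torch.randn(n_lab, dim))      # graph view, labeled pairs
y = torch.randint(0, 2, (n_lab,))                            # prerequisite labels
uc = (torch.randn(n_unlab, dim), torch.randn(n_unlab, dim))  # content view, unlabeled pairs
ug = (torch.randn(n_unlab, dim), torch.randn(n_unlab, dim))  # graph view, unlabeled pairs

for _ in range(5):  # alternate teacher/student roles each round
    distill_step(content, graph, opt_c, xc, y, uc, ug)  # graph teaches content
    distill_step(graph, content, opt_g, xg, y, ug, uc)  # content teaches graph
```

The intuition behind alternating (rather than one-way) distillation is that each model is strong where the other is weak: the graph-based model exploits the observed concept graph but cannot score concepts absent from it, while the content-based model handles any concept with text. Alternating rounds let each transfer its strengths to the other, so the content-based model can ultimately predict prerequisites for unseen concepts.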

Citation (APA)

Zhu, Y., & Zamani, H. (2022). Predicting Prerequisite Relations for Unseen Concepts. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, EMNLP 2022 (pp. 8542–8548). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.emnlp-main.585
