Deep Knowledge Tracing with Transformers


Abstract

In this work, we propose a Transformer-based model to trace students’ knowledge acquisition. We modified the Transformer structure to utilize 1) the association between questions and skills and 2) the elapsed time between question steps. The use of question-skill associations allows the model to learn specific representations for frequently encountered questions while representing rare questions with their underlying skill representations. The inclusion of elapsed time opens the opportunity to address forgetting. Our approach outperforms the state-of-the-art methods in the literature by roughly 10% in AUC on frequently used public datasets.
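The two modifications described above can be sketched in a few lines. The snippet below is a minimal, illustrative NumPy sketch, not the authors' implementation: the lookup tables, the `RARE_THRESHOLD` cutoff, and the exponential decay form are all assumptions chosen to make the two ideas concrete (frequent questions get their own embedding while rare ones fall back to their skill embedding, and attention to older interactions is down-weighted by elapsed time to model forgetting).

```python
import numpy as np

np.random.seed(0)
D = 8               # embedding dimension (illustrative)
RARE_THRESHOLD = 5  # hypothetical cutoff: below this count, fall back to the skill

# Hypothetical lookup tables; in the actual model these would be learned jointly.
question_emb = {q: np.random.randn(D) for q in range(100)}
skill_emb = {s: np.random.randn(D) for s in range(10)}
question_to_skill = {q: q % 10 for q in range(100)}            # toy mapping
question_counts = {q: (50 if q < 20 else 1) for q in range(100)}  # toy frequencies

def embed_question(q):
    """Use the question's own embedding when it is frequent; otherwise
    represent it by its underlying skill embedding."""
    if question_counts[q] >= RARE_THRESHOLD:
        return question_emb[q]
    return skill_emb[question_to_skill[q]]

def time_decayed_attention(scores, elapsed, decay=0.1):
    """Down-weight attention logits for older interactions with an
    exponential decay in elapsed time -- one simple way to model forgetting."""
    decayed = scores * np.exp(-decay * np.asarray(elapsed, dtype=float))
    weights = np.exp(decayed - decayed.max())  # stable softmax
    return weights / weights.sum()
```

For example, `embed_question(50)` returns the embedding of skill 0 because question 50 is rare in the toy counts, while `embed_question(5)` returns its own question embedding; `time_decayed_attention(np.ones(3), [0, 10, 20])` yields monotonically decreasing attention weights over increasingly old interactions.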

Citation (APA)

Pu, S., Yudelson, M., Ou, L., & Huang, Y. (2020). Deep Knowledge Tracing with Transformers. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 12164 LNAI, pp. 252–256). Springer. https://doi.org/10.1007/978-3-030-52240-7_46
