Inferring learning from gaze data during interaction with an environment to support self-regulated learning

60 citations · 76 readers

Abstract

In this paper, we explore the potential of gaze data as a source of information to predict learning as students interact with MetaTutor, an ITS that scaffolds self-regulated learning. Using data from 47 college students, we show that a classifier using a variety of gaze features achieves considerable accuracy in predicting student learning after seeing gaze data from the complete interaction. We also show promising results on the classifier's ability to detect learning in real time during the interaction. © 2013 Springer-Verlag Berlin Heidelberg.
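The paper itself does not include code; the following is a minimal sketch of the general approach the abstract describes: summarizing each student's gaze behavior as a feature vector and training a classifier to predict high vs. low learning. The feature names (fixation rate, mean fixation duration, proportion of time on content areas of interest), the synthetic data, and the choice of logistic regression are illustrative assumptions, not the authors' actual features, labels, or classifier.

```python
# Sketch only: classify learners as high/low learning gain from aggregate
# gaze features. Features, labels, and model are hypothetical stand-ins.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic per-student gaze summaries for 47 students:
# columns = [fixation_rate_per_s, mean_fixation_duration_ms, prop_time_on_content_aoi]
X = rng.normal(loc=[3.0, 250.0, 0.6], scale=[0.5, 40.0, 0.1], size=(47, 3))

# Synthetic binary labels standing in for a high/low learning-gain split
y = (X[:, 2] + rng.normal(0.0, 0.05, size=47) > 0.6).astype(int)

# Standardize features, then fit a logistic-regression classifier
clf = make_pipeline(StandardScaler(), LogisticRegression())

# Estimate predictive accuracy with 5-fold cross-validation
scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print(f"Cross-validated accuracy: {scores.mean():.2f}")
```

In the setting the abstract describes, the same pipeline could in principle be re-run on gaze features accumulated up to a given point in the interaction to approximate real-time prediction, but that detail is an assumption here rather than a description of the authors' method.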

Citation (APA)

Bondareva, D., Conati, C., Feyzi-Behnagh, R., Harley, J. M., Azevedo, R., & Bouchet, F. (2013). Inferring learning from gaze data during interaction with an environment to support self-regulated learning. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 7926 LNAI, pp. 229–238). Springer Verlag. https://doi.org/10.1007/978-3-642-39112-5_24
