Predicting Students' Attention Level with Interpretable Facial and Head Dynamic Features in an Online Tutoring System

Abstract

Engaged learners are effective learners. Although it is widely recognized that engagement plays a vital role in learning effectiveness, engagement remains an elusive psychological construct that has yet to find a consensus definition and a reliable measurement. In this study, we attempted to discover plausible operational definitions of engagement within an online learning context. We achieved this goal by first deriving a set of interpretable features describing the dynamics of eye, head, and mouth movement from facial landmarks extracted from video recordings of students interacting with an online tutoring system. We then assessed their predictive value for engagement, which was approximated by synchronized measurements from a commercial EEG brainwave headset worn by the students. Our preliminary results show that these features reduce root mean squared error (RMSE) by 29% compared with a default predictor, and that a random forest model outperforms a linear regressor.
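
The abstract compares a random forest and a linear regressor against a default predictor using RMSE on EEG-approximated attention labels. Below is a minimal sketch of that kind of comparison, assuming scikit-learn and a hypothetical feature matrix of per-window eye/head/mouth movement statistics with synchronized EEG attention readings as the target; the paper's actual feature definitions and data pipeline may differ.

```python
# Minimal sketch of the model comparison described in the abstract.
# Assumptions (not from the paper): scikit-learn, a feature matrix X whose
# columns are hypothetical eye/head/mouth dynamic statistics per time window,
# and a target y holding synchronized EEG attention readings.
import numpy as np
from sklearn.dummy import DummyRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split


def rmse(y_true, y_pred):
    """Root mean squared error, the metric reported in the abstract."""
    return float(np.sqrt(mean_squared_error(y_true, y_pred)))


def compare_models(X, y, seed=0):
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.2, random_state=seed
    )

    models = {
        "default (mean) predictor": DummyRegressor(strategy="mean"),
        "linear regression": LinearRegression(),
        "random forest": RandomForestRegressor(n_estimators=200, random_state=seed),
    }

    scores = {}
    for name, model in models.items():
        model.fit(X_tr, y_tr)
        scores[name] = rmse(y_te, model.predict(X_te))

    baseline = scores["default (mean) predictor"]
    for name, score in scores.items():
        reduction = 100.0 * (baseline - score) / baseline
        print(f"{name}: RMSE={score:.3f} ({reduction:+.1f}% vs. baseline)")
    return scores


if __name__ == "__main__":
    # Synthetic stand-in data purely to make the sketch runnable.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 12))  # 12 hypothetical facial/head features
    y = X[:, 0] * 0.6 - X[:, 3] * 0.4 + rng.normal(scale=0.5, size=500)
    compare_models(X, y)
```

Under this setup, the 29% RMSE reduction reported in the abstract corresponds to a model whose error is about 71% of the default predictor's.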

Cite

APA

Peng, S., Chen, L., Gao, C., & Tong, R. J. (2020). Predicting Students' Attention Level with Interpretable Facial and Head Dynamic Features in an Online Tutoring System. In AAAI 2020 - 34th AAAI Conference on Artificial Intelligence (pp. 13895–13896). AAAI Press.
