Affect detection from multichannel physiology during learning sessions with AutoTutor


Abstract

It is widely acknowledged that learners experience a variety of emotions while interacting with Intelligent Tutoring Systems (ITS); hence, detecting and responding to those emotions might improve learning outcomes. This study uses machine learning techniques to detect learners' affective states from multichannel physiological signals (heart activity, respiration, facial muscle activity, and skin conductivity) during tutorial interactions with AutoTutor, an ITS with conversational dialogues. Immediately after their sessions with AutoTutor, learners self-reported the affective states they had experienced (both discrete emotions and degrees of valence/arousal) via a retrospective judgment protocol. In addition to mapping the discrete learning-centered emotions (e.g., confusion and frustration) onto a dimensional valence/arousal space, we developed and validated an automatic affect classifier using physiological signals. Results indicate that the classifier was moderately successful at detecting naturally occurring emotions during the AutoTutor sessions. © 2011 Springer-Verlag Berlin Heidelberg.
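The classification step described above can be pictured as windowed feature extraction over the physiological channels followed by a supervised classifier. The sketch below is purely illustrative and is not the paper's pipeline: the channel names, window representation, mean/std features, and nearest-centroid classifier are all assumptions chosen for a minimal, self-contained example.

```python
from statistics import mean, pstdev

# Hypothetical channel set mirroring the four signal types named in the
# abstract (heart activity, respiration, facial EMG, skin conductivity).
CHANNELS = ["ecg", "respiration", "emg", "skin_conductance"]

def features(window):
    """Flatten a window {channel: [samples]} into per-channel mean and std."""
    feats = []
    for ch in CHANNELS:
        feats.append(mean(window[ch]))
        feats.append(pstdev(window[ch]))
    return feats

def train_centroids(labeled_windows):
    """labeled_windows: list of (window, affect_label) pairs.
    Returns a map from label to the centroid of its feature vectors."""
    by_label = {}
    for window, label in labeled_windows:
        by_label.setdefault(label, []).append(features(window))
    return {lab: [mean(col) for col in zip(*rows)]
            for lab, rows in by_label.items()}

def classify(window, centroids):
    """Assign the label whose centroid is nearest in feature space."""
    f = features(window)
    return min(centroids,
               key=lambda lab: sum((a - b) ** 2
                                   for a, b in zip(f, centroids[lab])))

# Toy synthetic data standing in for real recordings (illustration only):
# higher signal levels loosely stand in for a higher-arousal state.
def make_window(level):
    return {ch: [level + 0.1 * i for i in range(5)] for ch in CHANNELS}

train = [(make_window(1.0), "boredom"), (make_window(3.0), "confusion")]
cents = train_centroids(train)
print(classify(make_window(2.9), cents))  # prints "confusion"
```

A real system would replace the toy features with physiologically motivated ones (e.g., heart-rate variability statistics) and the nearest-centroid rule with a stronger learner, but the windowing-then-classify structure stays the same.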

APA

Hussain, M. S., Alzoubi, O., Calvo, R. A., & D’Mello, S. K. (2011). Affect detection from multichannel physiology during learning sessions with AutoTutor. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6738 LNAI, pp. 131–138). https://doi.org/10.1007/978-3-642-21869-9_19
