Multiclass Emotion Classification Using Pupil Size in VR: Tuning Support Vector Machines to Improve Performance


Abstract

Emotion recognition and classification have become popular topics of research in computer science. In this paper, we present an approach to emotion classification in Virtual Reality (VR) using eye-tracking data alone with machine learning. Emotions were classified into four distinct classes according to the Circumplex Model of Affect. The emotional stimuli were 360° videos presented in VR across four stimulation sessions, one per emotion quadrant. Eye-tracking data were recorded with an eye-tracker, and pupil diameter was chosen as the single-modality feature for this investigation. The classifier used in this experiment was a Support Vector Machine (SVM). The best accuracy, 57.65%, was obtained by tuning the SVM's parameters.
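The abstract does not specify how the SVM parameters were tuned. A minimal sketch of the general approach — grid-searching the RBF-SVM hyperparameters C and gamma for a four-class problem — might look like the following. The pupil-diameter features and data here are synthetic placeholders, not the study's actual dataset.

```python
# Hypothetical sketch of SVM hyperparameter tuning for 4-class emotion
# classification from pupil-diameter features. The feature set (windowed
# pupil-diameter statistics) and the synthetic data are assumptions;
# the paper does not describe its preprocessing or tuning grid.
import numpy as np
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Four emotion quadrants (Circumplex Model of Affect); each sample is a
# vector of per-window pupil-diameter statistics (mean, std, min, max).
n_per_class = 50
X = np.vstack(
    [rng.normal(loc=c, scale=1.0, size=(n_per_class, 4)) for c in range(4)]
)
y = np.repeat(np.arange(4), n_per_class)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, stratify=y, random_state=0
)

# Grid-search C and gamma with cross-validation; standardize features first,
# since SVMs are sensitive to feature scale.
grid = GridSearchCV(
    make_pipeline(StandardScaler(), SVC(kernel="rbf")),
    param_grid={
        "svc__C": [0.1, 1, 10, 100],
        "svc__gamma": ["scale", 0.01, 0.1, 1],
    },
    cv=5,
)
grid.fit(X_train, y_train)
print("best params:", grid.best_params_)
print(f"held-out accuracy: {grid.score(X_test, y_test):.3f}")
```

The held-out accuracy reported by the sketch is meaningless on synthetic data; the point is only the tuning workflow (scale, grid-search, evaluate on a held-out split).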

Citation (APA)

Zheng, L. J., Mountstephens, J., & Wi, J. T. T. (2020). Multiclass Emotion Classification Using Pupil Size in VR: Tuning Support Vector Machines to Improve Performance. In Journal of Physics: Conference Series (Vol. 1529). Institute of Physics Publishing. https://doi.org/10.1088/1742-6596/1529/5/052062
