How are strong positive affective states related to eye-tracking features, and how can they be used to enhance well-being in multimedia consumption? In this paper, we propose a robust classification algorithm for predicting strong happy emotions from a large set of features acquired from wearable eye-tracking glasses. We evaluate the potential transferability of the model across subjects and provide a model-agnostic, interpretable feature-importance metric. Our proposed algorithm achieves a true-positive rate of 70% while keeping the false-positive rate at a low 10%, with features extracted from the pupil diameter identified as the most important.
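As a rough illustration of the kind of evaluation pipeline the abstract describes, the sketch below trains a generic classifier on hypothetical gaze features, reads off the true-positive rate at a roughly 10% false-positive operating point, and computes a model-agnostic feature importance via permutation. The classifier choice (random forest), the feature names, and the synthetic data are assumptions for illustration only, not the authors' actual method or dataset.

```python
# Minimal sketch (not the paper's implementation): generic classifier on
# hypothetical eye-tracking features, evaluated at a fixed false-positive
# rate, with permutation-based (model-agnostic) feature importance.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.metrics import roc_curve
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical feature matrix: rows = time windows, columns = gaze features.
feature_names = ["pupil_diameter_mean", "pupil_diameter_std",
                 "fixation_duration_mean", "saccade_amplitude_mean"]
X = rng.normal(size=(500, len(feature_names)))
y = rng.integers(0, 2, size=500)  # 1 = strong happy, 0 = otherwise (synthetic labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# Operating point: true-positive rate achieved while FPR stays at or below 10%.
scores = clf.predict_proba(X_te)[:, 1]
fpr, tpr, _ = roc_curve(y_te, scores)
print("TPR at FPR <= 0.10:", tpr[fpr <= 0.10].max())

# Model-agnostic feature importance: permute each feature on held-out data
# and measure the drop in score.
imp = permutation_importance(clf, X_te, y_te, n_repeats=20, random_state=0)
for name, mean_imp in zip(feature_names, imp.importances_mean):
    print(f"{name}: {mean_imp:.3f}")
```

For subject transferability as mentioned in the abstract, the same evaluation could be run with subject-wise splits (e.g. leave-one-subject-out) instead of a random train/test split; the paper's exact protocol is not specified here.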
Bethge, D., Chuang, L., & Grosse-Puppendahl, T. (2020). Analyzing Transferability of Happiness Detection via Gaze Tracking in Multimedia Applications. In Eye Tracking Research and Applications Symposium (ETRA). Association for Computing Machinery. https://doi.org/10.1145/3379157.3391655