Our aim was to determine whether cognitive emotion states (neutral, disgust, shame, and "sensory pleasure") can be detected with a remote eye tracker operating at a range of approximately 1 meter. The implementation was based on a self-learning artificial neural network (ANN) used for profile building and for identifying and recognizing the emotional state. Participants were presented with audiovisual stimuli (videos with sound) to elicit and measure emotional responses. The proposed system classified each felt emotion with an average accuracy of 90% over a 2-second measuring interval.
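For illustration only, the sketch below shows one way such a pipeline could be assembled: per-window gaze and pupil features computed over 2-second intervals, fed to a small feed-forward neural network for 4-class emotion classification. The feature set, network layout, and all identifiers are assumptions for demonstration, not the authors' exact method, and the data here is random placeholder input.

import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Emotion labels studied in the paper; the index order here is arbitrary.
EMOTIONS = ["neutral", "disgust", "shame", "sensory_pleasure"]

def window_features(gaze: np.ndarray, pupil: np.ndarray) -> np.ndarray:
    """Summarise one 2-second window of eye-tracker samples.
    gaze: (n, 2) array of gaze-point coordinates; pupil: (n,) pupil diameters.
    The chosen statistics are illustrative, not the paper's feature set."""
    gx, gy = gaze[:, 0], gaze[:, 1]
    return np.array([
        gx.std(), gy.std(),          # gaze dispersion (fixation stability proxy)
        np.abs(np.diff(gx)).mean(),  # mean horizontal gaze velocity
        np.abs(np.diff(gy)).mean(),  # mean vertical gaze velocity
        pupil.mean(), pupil.std(),   # pupil diameter level and variability
    ])

# Placeholder data: X holds per-window feature vectors, y the elicited emotion label.
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 6))
y = rng.integers(0, len(EMOTIONS), size=400)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Small feed-forward ANN classifier; hidden-layer sizes are an assumption.
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0)
clf.fit(X_train, y_train)
print("window-level accuracy:", accuracy_score(y_test, clf.predict(X_test)))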
CITATION STYLE
Maskeliunas, R., & Raudonis, V. (2016). Are you ashamed? Can a gaze tracker tell? PeerJ Computer Science, 2, e75. https://doi.org/10.7717/peerj-cs.75