This demo aims to evoke subjects' emotions in a VR scene versus a computer-screen scene and to test subjects' attentional bias toward emotional face stimuli after emotion evocation of differing potency. In the eye-movement task, the study will also record the time from the onset of the target stimulus until subjects shift their gaze away from the distracting stimulus (an emotional face) and fixate the target, using this latency as an indicator of emotional intensity, and will compare the intensity of evoked emotion at high and low immersion levels to investigate the effect of VR immersion on emotion evocation across different levels of pleasure and arousal. The results can be further applied to robot-assisted instructional design to help students attain emotional states conducive to efficient learning.
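The latency measure described above (time from target onset until gaze leaves the emotional-face distractor and lands on the target) can be computed from raw gaze samples. The sketch below is a minimal illustration of one way to do this, assuming rectangular areas of interest and a simple sample format; the names, data structures, and example values are hypothetical and not taken from the paper.

```python
# Minimal sketch (not the authors' code): compute the disengagement latency
# from target onset until gaze has left the distractor AOI and first lands
# inside the target AOI. AOI coordinates and sample format are assumptions.

from dataclasses import dataclass
from typing import List, Optional, Tuple

Rect = Tuple[float, float, float, float]  # (x_min, y_min, x_max, y_max)


@dataclass
class GazeSample:
    t: float  # timestamp in seconds
    x: float  # horizontal gaze position
    y: float  # vertical gaze position


def in_aoi(s: GazeSample, aoi: Rect) -> bool:
    x0, y0, x1, y1 = aoi
    return x0 <= s.x <= x1 and y0 <= s.y <= y1


def disengagement_latency(samples: List[GazeSample],
                          target_onset: float,
                          distractor_aoi: Rect,
                          target_aoi: Rect) -> Optional[float]:
    """Time from target onset until gaze leaves the distractor AOI and
    first falls inside the target AOI; None if this never happens."""
    left_distractor = False
    for s in samples:
        if s.t < target_onset:
            continue
        if not in_aoi(s, distractor_aoi):
            left_distractor = True
        if left_distractor and in_aoi(s, target_aoi):
            return s.t - target_onset
    return None


if __name__ == "__main__":
    # Synthetic example: gaze dwells on the distractor for 300 ms,
    # then shifts to the target region.
    distractor = (0.0, 0.0, 0.4, 0.4)
    target = (0.6, 0.6, 1.0, 1.0)
    samples = [GazeSample(t=i * 0.01,
                          x=0.2 if i < 30 else 0.8,
                          y=0.2 if i < 30 else 0.8)
               for i in range(60)]
    print(disengagement_latency(samples, target_onset=0.0,
                                distractor_aoi=distractor,
                                target_aoi=target))  # ~0.3 s
```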
Liu, F., Qin, J., Zhang, Z., Tang, K., Hu, J., & Zhou, Y. (2022). Eye Tracking Emotion Evoking Interaction Based on Go/No-Go Paradigm VR Demonstration. In Proceedings of the 18th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and Its Applications in Industry (VRCAI 2022). Association for Computing Machinery. https://doi.org/10.1145/3574131.3574465