Abstract
This paper introduces a multimodal approach for detecting an individual's affective state during exposure to visual narratives. We use four modalities, namely visual facial behaviors, heart rate measurements, thermal imaging, and verbal descriptions, and show that we can predict changes in the affect that people experience when exposed to audio-visual stimuli, whether positive or negative. We conduct experiments that aim both to predict the presence of an affective response during exposure to visual narratives and to distinguish between positive and negative affect valence. Extensive feature analyses and experiments on predicting the presence of affect demonstrate how the four modalities we explore effectively complement each other.
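To make the setup concrete, the sketch below illustrates one common way such a multimodal predictor can be built: early fusion, where per-modality feature vectors are concatenated and fed to a single binary classifier for affect presence. The feature dimensions, the synthetic data, and the simple logistic-regression trainer are all illustrative assumptions, not the paper's actual features or models.

```python
import numpy as np

rng = np.random.default_rng(0)

def fuse(facial, heart, thermal, verbal):
    """Early fusion: concatenate feature vectors from the four modalities."""
    return np.concatenate([facial, heart, thermal, verbal])

def train_logreg(X, y, lr=0.5, steps=500):
    """Minimal logistic regression via batch gradient descent."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid probabilities
        g = p - y                               # gradient of log loss
        w -= lr * (X.T @ g) / len(y)
        b -= lr * g.mean()
    return w, b

def predict(X, w, b):
    """Binary decision: affect present (1) vs. absent (0)."""
    return (X @ w + b > 0).astype(int)

# Synthetic data: per-modality feature sizes are made up for illustration;
# 'affect present' samples are shifted so the classes are separable.
n, dims = 100, (8, 2, 4, 6)  # facial, heart rate, thermal, verbal dims
X = np.vstack([
    fuse(*[rng.normal(label, 1.0, d) for d in dims])
    for label in [0] * n + [1] * n
])
y = np.array([0] * n + [1] * n)

w, b = train_logreg(X, y)
acc = (predict(X, w, b) == y).mean()
```

The same fused representation could instead feed separate classifiers per modality whose scores are averaged (late fusion); the paper's complementarity finding is what motivates combining the four streams at all.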
Citation
Burzo, M., Perez-Rosas, V., McDuff, D., Morency, L. P., Narvaez, A., & Mihalcea, R. (2019). Sensing Affective Response to Visual Narratives. IEEE Computational Intelligence Magazine, 14(2), 54–66. https://doi.org/10.1109/MCI.2019.2901086