PEM360: A dataset of 360° videos with continuous Physiological measurements, subjective Emotional ratings and Motion traces


Abstract

From a user perspective, immersive content can elicit more intense emotions than flat-screen presentations. From a system perspective, efficient storage and distribution remain challenging and must take user attention into account. Understanding the connection between user attention, user emotions, and immersive content is therefore key. In this article, we present a new dataset, PEM360, of user head movements and gaze recordings in 360° videos, along with self-reported emotional ratings of valence and arousal, and continuous physiological measurements of electrodermal activity and heart rate. The stimuli are selected to enable spatiotemporal analysis of the connection between content, user motion, and emotion. We describe and provide a set of software tools to process the various data modalities, and introduce a joint instantaneous visualization of user attention and emotion, which we name Emotional maps. We exemplify new types of analyses the PEM360 dataset can enable. All data and code are made available in a reproducible framework.
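The abstract's "Emotional maps" suggest projecting attention samples onto the equirectangular frame and weighting them by emotion. The sketch below is a hypothetical reconstruction of that idea, not the dataset's actual tooling: a 2D histogram of yaw/pitch gaze samples weighted by instantaneous arousal. All function and variable names here are illustrative assumptions; the paper's released code defines the real processing pipeline.

```python
import numpy as np

def emotional_map(yaw, pitch, arousal, width=64, height=32):
    """Accumulate gaze/head samples into an equirectangular grid,
    each sample weighted by its instantaneous arousal level.
    (Illustrative sketch; not the PEM360 reference implementation.)"""
    # Map yaw in [-180, 180) and pitch in [-90, 90) to pixel bins.
    cols = ((yaw + 180.0) / 360.0 * width).astype(int) % width
    rows = ((pitch + 90.0) / 180.0 * height).astype(int) % height
    grid = np.zeros((height, width))
    # Sum arousal weights at each visited cell (handles repeated indices).
    np.add.at(grid, (rows, cols), arousal)
    if grid.max() > 0:
        grid /= grid.max()  # normalize to [0, 1] for visualization
    return grid

# Synthetic example: 100 samples clustered around the front equator view.
rng = np.random.default_rng(0)
yaw = rng.normal(0.0, 20.0, 100)      # degrees, horizontal
pitch = rng.normal(0.0, 10.0, 100)    # degrees, vertical
arousal = rng.uniform(1.0, 9.0, 100)  # e.g. SAM-style 1-9 arousal scale
m = emotional_map(yaw, pitch, arousal)
```

Overlaying such a map on the video frame would show where attention and high arousal coincide over time.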

Cite (APA)

Guimard, Q., Robert, F., Bauce, C., Ducreux, A., Sassatelli, L., Wu, H. Y., … Gros, A. (2022). PEM360: A dataset of 360° videos with continuous Physiological measurements, subjective Emotional ratings and Motion traces. In MMSys 2022 - Proceedings of the 13th ACM Multimedia Systems Conference (pp. 252–258). Association for Computing Machinery, Inc. https://doi.org/10.1145/3524273.3532895
