Virtual Reality (VR) technology enables "embodied interactions" in realistic environments where users can move and interact freely, engaging deep physical and emotional states. However, a comprehensive understanding of the embodied user experience is currently limited by the extent to which relevant observations can be made and the accuracy with which those observations can be interpreted. Paul Dourish proposed a way forward through the characterisation of embodied interactions in three senses: ontology, intersubjectivity, and intentionality. In a joint effort between computer scientists and neuroscientists, we built a framework for designing studies that investigate multimodal embodied experiences in VR, and we apply it to study the impact of simulated low vision on user navigation. Our methodology involves designing 3D scenarios annotated with an ontology, modelling intersubjective tasks, and correlating multimodal metrics such as gaze and physiology to derive intentions. We show how this framework enables a more fine-grained understanding of embodied interactions in behavioural research.
Robert, F., Wu, H.-Y., Sassatelli, L., Ramanoel, S., Gros, A., & Winckler, M. (2023). An Integrated Framework for Understanding Multimodal Embodied Experiences in Interactive Virtual Reality. In IMX 2023: Proceedings of the 2023 ACM International Conference on Interactive Media Experiences (pp. 14–26). Association for Computing Machinery. https://doi.org/10.1145/3573381.3596150