Birdbox: Exploring the User Experience of Crossmodal, Multisensory Data Representations


Abstract

We contribute to an improved understanding of how physical multisensory data representations are experienced and how specific modalities affect the user experience (UX). We investigate how people make sense of Birdbox, a crossmodal data representation that employs combined haptic-audio, audio-visual, or visual-haptic output for data about birds. Findings indicate that participants preferred haptic output for the bodily experience it triggered. Participants further created their own mappings between data and modality: haptic was mapped to aggression, and audio to speed. Especially with (soft) haptic output, Birdbox was experienced as a living entity. This was also evident in participants' bodily interactions, as they held Birdbox as if it were a small bird. We contribute to a better understanding of the UX of different modalities in multisensory data representations, highlighting strengths of the haptic modality and of metaphorical understandings of modalities.

Citation (APA)

Dodani, A., Van Koningsbruggen, R., & Hornecker, E. (2022). Birdbox: Exploring the User Experience of Crossmodal, Multisensory Data Representations. In ACM International Conference Proceeding Series (pp. 12–21). Association for Computing Machinery. https://doi.org/10.1145/3568444.3568455