Multisensory physical environments for data representation

Abstract

This paper reviews theoretical research and projects in data representation that use different sensory modalities, embodiment, physical objects, and immersive environments. Other topics include the impact of cross-modal perception on data representation and the role audiovisual aesthetics play in the interpretation of data. Research has shown that cross-modal perception enhances sensory stimuli. Sound, touch, gesture, and movement engage the user and create holistic environments that provide multi-dimensional representations of complex data relationships. These data representations include data sculptures, ambient displays, and multisensory environments that use our intuitive abilities to process information from different sensory modalities. By using multiple senses, it is possible to increase the number of variables and relationships that can be represented simultaneously in complex data sets.

Citation (APA)

Search, P. (2016). Multisensory physical environments for data representation. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9748, pp. 202–213). Springer Verlag. https://doi.org/10.1007/978-3-319-40406-6_19
