Multisensory immersive analytics

Abstract

While visual cues are traditionally used for visual analytics, multimodal interaction technologies offer many new possibilities. This chapter explores the opportunities and challenges, for both developers and users, of using non-visual sensory channels to represent, understand, and interact with data. Users can experience data in new ways: variables from complex datasets can be conveyed through different senses; presentations become more accessible to people with vision impairment and can be personalized to specific user needs; and interactions can engage multiple senses to provide more natural and transparent methods. Together, these techniques give users a better understanding of the underlying information. Although the emphasis of this chapter is on non-visual immersive analytics, we also discuss how visual presentations can be integrated with other modalities, and the opportunities for mixing several sensory signals, including the visual domain.
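
One concrete way to convey a variable from a dataset through a non-visual channel is parameter-mapping sonification, in which data values drive an auditory property such as pitch. The sketch below is an illustrative assumption rather than a method taken from this chapter: the pitch range, note duration, and the sonify helper are hypothetical choices, and the example uses only NumPy and Python's standard wave module.

```python
# Minimal parameter-mapping sonification sketch (illustrative, not from the chapter):
# one data variable is mapped to pitch so a listener can follow its trend by ear.
import math
import wave

import numpy as np

SAMPLE_RATE = 44100               # audio samples per second
NOTE_SECONDS = 0.25               # duration of the tone for each data point
LOW_HZ, HIGH_HZ = 220.0, 880.0    # assumed pitch range (A3 to A5)


def sonify(values, path="sonification.wav"):
    """Map each value linearly to a pitch and write the resulting tones to a WAV file."""
    values = np.asarray(values, dtype=float)
    lo, hi = values.min(), values.max()
    span = hi - lo if hi > lo else 1.0

    t = np.linspace(0.0, NOTE_SECONDS, int(SAMPLE_RATE * NOTE_SECONDS), endpoint=False)
    tones = []
    for v in values:
        # Linear mapping: smallest value -> LOW_HZ, largest value -> HIGH_HZ.
        freq = LOW_HZ + (v - lo) / span * (HIGH_HZ - LOW_HZ)
        tones.append(0.5 * np.sin(2.0 * math.pi * freq * t))

    pcm = (np.concatenate(tones) * 32767).astype(np.int16)   # 16-bit mono PCM

    with wave.open(path, "wb") as wav:
        wav.setnchannels(1)
        wav.setsampwidth(2)
        wav.setframerate(SAMPLE_RATE)
        wav.writeframes(pcm.tobytes())


if __name__ == "__main__":
    # A rising-then-falling series becomes a rising-then-falling melody.
    sonify([1, 2, 4, 8, 12, 9, 5, 3, 1])
```

The same mapping idea generalizes to other non-visual channels discussed in the chapter, for example driving haptic vibration intensity instead of pitch.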

Citation (APA)

McCormack, J., Roberts, J. C., Bach, B., Freitas, C. D. S., Itoh, T., Hurter, C., & Marriott, K. (2018). Multisensory immersive analytics. In Lecture Notes in Computer Science (Vol. 11190, pp. 57–94). Springer. https://doi.org/10.1007/978-3-030-01388-2_3
