Dynamic, Abstract Representations of Audio in a Mobile Augmented Reality Conferencing System

  • White S
  • Feiner S

Abstract

We describe a wearable audio conferencing and information presentation system that represents individual participants and audio elements through dynamic, visual abstractions, presented on a tracked, see-through head-worn display. Our interest is in communication spaces, annotation, and data that are represented by auditory media with synchronistic or synesthetic visualizations. Representations can transition between different spatial modalities as audio elements enter and exit the wearer’s physical presence. In this chapter, we discuss the user interface and infrastructure, SoundSight, which uses the Skype Internet telephony API to support wireless conferencing, and describe our early experience using the system.

Citation (APA)

White, S., & Feiner, S. (2011). Dynamic, Abstract Representations of Audio in a Mobile Augmented Reality Conferencing System. In Recent Trends of Mobile Collaborative Augmented Reality Systems (pp. 149–160). Springer New York. https://doi.org/10.1007/978-1-4419-9845-3_12
