We describe a wearable audio conferencing and information presentation system that represents individual participants and audio elements through dynamic, visual abstractions, presented on a tracked, see-through head-worn display. Our interest is in communication spaces, annotation, and data that are represented by auditory media with synchronized or synesthetic visualizations. Representations can transition between different spatial modalities as audio elements enter and exit the wearer’s physical presence. In this chapter, we discuss the user interface and infrastructure of SoundSight, which uses the Skype Internet telephony API to support wireless conferencing, and describe our early experience using the system.
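The transition between spatial modalities described above can be sketched as a simple proximity rule. The sketch below is purely illustrative: the class names, the presence radius, and the body-stabilized/world-stabilized switching rule are assumptions for exposition, not the system's actual logic.

```python
from dataclasses import dataclass
from enum import Enum

class Modality(Enum):
    BODY_STABILIZED = "body"    # abstraction anchored relative to the wearer
    WORLD_STABILIZED = "world"  # abstraction anchored at a physical location

@dataclass
class AudioElement:
    name: str
    distance_m: float           # distance from the wearer, in meters
    modality: Modality = Modality.BODY_STABILIZED

def update_modality(elem: AudioElement, presence_radius_m: float = 3.0) -> Modality:
    """Hypothetical rule: switch an element's visual representation between
    spatial modalities as it enters or exits the wearer's physical presence."""
    elem.modality = (Modality.WORLD_STABILIZED
                     if elem.distance_m <= presence_radius_m
                     else Modality.BODY_STABILIZED)
    return elem.modality

# A remote participant approaching the wearer:
caller = AudioElement("remote participant", distance_m=5.0)
update_modality(caller)   # outside the presence radius: body-stabilized
caller.distance_m = 1.5
update_modality(caller)   # inside the presence radius: world-stabilized
```

In a real head-worn display system, the distance would come from tracking data, and the transition would likely be animated rather than discrete.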
White, S., & Feiner, S. (2011). Dynamic, Abstract Representations of Audio in a Mobile Augmented Reality Conferencing System. In Recent Trends of Mobile Collaborative Augmented Reality Systems (pp. 149–160). Springer New York. https://doi.org/10.1007/978-1-4419-9845-3_12