Augmented reality (AR) research has traditionally focused on visual augmentation, which requires a visual rendering device such as a head-mounted display; these devices are usually obtrusive, expensive, and not widely accepted socially. In contrast, wearable audio headsets are already widespread, and the auditory sense plays an important role in everyday interactions with the environment. In this PhD project, we explore audio augmented reality (AAR), which augments objects with 3D sounds that are spatialized virtually yet perceived as originating from real locations in space. We aim to design, implement, and evaluate AAR systems that enable intuitive and immersive interactions with objects in a variety of consumer and industrial scenarios. By exploring AAR on pervasive and wearable devices, we hope to contribute to the vision of ubiquitous AR.
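The abstract's core idea is virtual spatialization: a mono sound is rendered so that a listener wearing ordinary stereo headphones perceives it at a real-world location. As a minimal sketch of how such spatialization works, the snippet below pans a mono signal to stereo using interaural time and level differences (ITD/ILD), with the Woodworth spherical-head model for the delay. This is an illustrative approximation chosen here, not the paper's actual rendering pipeline; the function name, the fixed ~6 dB maximum ILD, and the default head radius are assumptions for the example.

```python
import numpy as np

def spatialize(mono, azimuth_deg, sr=44100, head_radius=0.0875, c=343.0):
    """Crude binaural panning of a mono signal.

    azimuth_deg: source direction, 0 = straight ahead, +90 = fully right.
    Returns an (N, 2) stereo array: column 0 = left ear, column 1 = right ear.
    """
    theta = abs(np.deg2rad(azimuth_deg))
    # Woodworth ITD model for a rigid spherical head: the far ear's
    # signal arrives later by (r / c) * (theta + sin(theta)) seconds.
    itd = (head_radius / c) * (theta + np.sin(theta))
    delay = int(round(itd * sr))              # far-ear lag in samples
    # Simple ILD: attenuate the far ear up to ~6 dB at 90 degrees (assumed value).
    gain_far = 10 ** (-(abs(azimuth_deg) / 90.0) * 6.0 / 20.0)
    near = mono                               # ear facing the source
    far = np.concatenate([np.zeros(delay), mono * gain_far])[: len(mono)]
    if azimuth_deg >= 0:                      # source on the right: right ear is near
        return np.stack([far, near], axis=1)
    return np.stack([near, far], axis=1)
```

For a source at +90 degrees, the right channel is louder and leads the left by roughly 0.65 ms (about 29 samples at 44.1 kHz), which is enough for a listener to localize the sound to the right; real AAR systems refine this with head-related transfer functions (HRTFs) and head tracking.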
CITATION STYLE
Yang, J., & Mattern, F. (2019). Audio augmented reality for human-object interactions. In UbiComp/ISWC 2019 - Adjunct Proceedings of the 2019 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2019 ACM International Symposium on Wearable Computers (pp. 408–412). Association for Computing Machinery. https://doi.org/10.1145/3341162.3349302