There have been several research efforts on data visualization that exploit augmented reality technologies. However, most of this research focuses on tracking and visualization itself and gives little attention to social communities in augmented reality. In this paper, we propose a social augmented reality architecture that selectively visualizes sensor information based on the user's social network community. We show three scenarios: information from sensors embedded in mobile devices, from sensors in the environment, and from the social community. We expect that the proposed architecture will play a crucial role in selectively visualizing data from thousands of sensors according to the user's social network community. © 2011 Springer-Verlag.
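The core idea of selective, community-based visualization can be illustrated as a simple filter over sensor readings. The sketch below is a minimal illustration only; the function name, data shape, and community model are assumptions for demonstration and are not the paper's actual API or architecture.

```python
# Illustrative sketch (all names are assumptions, not from the paper):
# show a user only the sensor readings whose owners belong to the
# user's social-network community, mirroring the selective-visualization
# idea described in the abstract.

def visible_readings(user, community_members, readings):
    """Return only readings owned by the user or their community members."""
    allowed = set(community_members) | {user}
    return [r for r in readings if r["owner"] in allowed]

readings = [
    {"owner": "alice", "sensor": "temperature", "value": 21.5},
    {"owner": "bob",   "sensor": "humidity",    "value": 40.0},
    {"owner": "eve",   "sensor": "noise",       "value": 62.0},
]

# bob sees his own reading plus alice's (same community), but not eve's
filtered = visible_readings("bob", ["alice"], readings)
print([r["owner"] for r in filtered])  # → ['alice', 'bob']
```

Such a filter would sit between the sensor data sources (mobile, environmental, social) and the augmented reality view, so that only community-relevant data is rendered.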
CITATION STYLE
Lee, Y., Choi, J., Kim, S., Lee, S., & Jang, S. (2011). Social augmented reality for sensor visualization in ubiquitous virtual reality. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6773 LNCS, pp. 69–75). https://doi.org/10.1007/978-3-642-22021-0_9