Virtual reality devices are available with a wide range of resolutions and fields of view. Users can simultaneously interact within shared environments on head-mounted displays, cell phones, tablets, and PowerWalls. Sharing scenes across such devices requires solutions that smoothly synchronize shared navigation, minimize jitter, and avoid visual confusion. In this paper we present a system that allows a single guide to lead many remote users through a virtual environment, supporting a variety of mixed-device configurations. Techniques are implemented to minimize jitter and synchronize views.
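One common way to realize the jitter minimization the abstract mentions is for each follower to smooth incoming guide poses rather than snapping to them; the sketch below is a minimal, hypothetical illustration of that idea (the `Pose` type, `smooth_pose` function, and the smoothing factor `alpha` are assumptions, not the paper's actual implementation):

```python
import math
from dataclasses import dataclass

@dataclass
class Pose:
    x: float
    y: float
    z: float
    yaw: float  # heading in radians

def smooth_pose(current: Pose, target: Pose, alpha: float = 0.2) -> Pose:
    """Move a fraction alpha toward the guide's latest pose each frame,
    damping network jitter instead of teleporting the follower's view."""
    def lerp(a: float, b: float) -> float:
        return a + alpha * (b - a)
    # Interpolate yaw along the shortest arc so angle wrap-around
    # never causes a full spin of the follower's camera.
    dyaw = math.atan2(math.sin(target.yaw - current.yaw),
                      math.cos(target.yaw - current.yaw))
    return Pose(lerp(current.x, target.x),
                lerp(current.y, target.y),
                lerp(current.z, target.z),
                current.yaw + alpha * dyaw)

# Each frame the follower re-applies smooth_pose toward the guide's pose;
# over many frames the views converge without abrupt jumps.
follower = Pose(0.0, 0.0, 0.0, 0.0)
guide = Pose(10.0, 0.0, 5.0, math.pi / 2)
for _ in range(50):
    follower = smooth_pose(follower, guide)
```

Exponential smoothing of this kind trades a small amount of latency for visual stability, which is usually the right trade-off when many heterogeneous devices must follow one navigator.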
Citation
Cutchin, S., & Vazquez, I. (2016). Synchronized shared scene viewing in mixed VR devices in support of group collaboration. In Lecture Notes in Computer Science (Vol. 9929, pp. 348–352). Springer. https://doi.org/10.1007/978-3-319-46771-9_45