Synchronized shared scene viewing in mixed VR devices in support of group collaboration

Abstract

Virtual Reality devices are available with different resolutions and fields of view. Users can simultaneously interact within environments on head mounted displays, cell phones, tablets, and PowerWalls. Sharing scenes across such devices requires solutions that smoothly synchronize shared navigation, minimize jitter, and avoid visual confusion. In this paper we present a system that allows a single user to remotely guide many other users through a virtual environment. The system supports a variety of mixed-device environments and implements techniques to minimize jitter and keep views synchronized.
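The paper itself does not reproduce its implementation here; as a rough illustration of the kind of guided-navigation synchronization the abstract describes, the sketch below shows one common approach: the guide broadcasts camera poses, and each follower eases its local camera toward the latest received pose with exponential smoothing rather than snapping to every update, which damps jitter. All class and function names are hypothetical and not drawn from the paper.

```python
# Minimal sketch (not the authors' implementation) of follower-side pose smoothing
# for a guided shared-scene session: the guide broadcasts poses, the follower
# blends toward them each frame instead of snapping, reducing visible jitter.

from dataclasses import dataclass
import math


@dataclass
class Pose:
    """Camera position (x, y, z) and yaw in radians."""
    x: float
    y: float
    z: float
    yaw: float


def lerp(a: float, b: float, t: float) -> float:
    return a + (b - a) * t


def lerp_angle(a: float, b: float, t: float) -> float:
    """Interpolate angles along the shortest arc to avoid 2*pi jumps."""
    d = (b - a + math.pi) % (2 * math.pi) - math.pi
    return a + d * t


class FollowerCamera:
    """Local camera that tracks the guide's broadcast pose with smoothing."""

    def __init__(self, pose: Pose, smoothing: float = 0.2):
        self.pose = pose
        self.target = pose
        self.smoothing = smoothing  # 0 = frozen, 1 = snap to every update (jittery)

    def on_guide_update(self, pose: Pose) -> None:
        """Called whenever a new pose message arrives from the guide."""
        self.target = pose

    def tick(self) -> None:
        """Called once per local render frame; eases toward the guide's pose."""
        t = self.smoothing
        self.pose = Pose(
            lerp(self.pose.x, self.target.x, t),
            lerp(self.pose.y, self.target.y, t),
            lerp(self.pose.z, self.target.z, t),
            lerp_angle(self.pose.yaw, self.target.yaw, t),
        )


if __name__ == "__main__":
    follower = FollowerCamera(Pose(0.0, 0.0, 0.0, 0.0))
    follower.on_guide_update(Pose(4.0, 0.0, 2.0, math.pi / 2))
    for frame in range(5):
        follower.tick()
        p = follower.pose
        print(f"frame {frame}: x={p.x:.2f} z={p.z:.2f} yaw={p.yaw:.2f}")
```

The smoothing factor trades responsiveness against stability; per-device tuning (e.g., a lower factor on a head mounted display than on a tablet) is one plausible way to handle mixed-device differences, though the paper's specific techniques may differ.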


Citation (APA)

Cutchin, S., & Vazquez, I. (2016). Synchronized shared scene viewing in mixed VR devices in support of group collaboration. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9929 LNCS, pp. 348–352). Springer Verlag. https://doi.org/10.1007/978-3-319-46771-9_45
