Procedurally generated virtual reality from 3D reconstructed physical space

118 citations · 152 Mendeley readers
Abstract

We present a novel system for automatically generating immersive and interactive virtual reality (VR) environments using the real world as a template. The system captures indoor scenes in 3D, detects obstacles such as furniture and walls, and maps walkable areas (WA) to enable real walking in the generated virtual environment (VE). Depth data is also used to recognize and track objects during the VR experience. Detected objects are paired with virtual counterparts to leverage the physicality of the real world for a tactile experience. Our approach is new in that it allows a casual user to easily create virtual reality worlds in any indoor space of arbitrary size and shape without specialized equipment or training. We demonstrate our approach through a fully working system implemented on the Google Project Tango tablet device.
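The abstract does not give implementation details, but the walkable-area step it describes — classifying floor cells as free of obstacles and keeping only the region the user can actually reach — can be illustrated with a minimal sketch. All names, the grid representation, and the clearance threshold below are hypothetical choices for illustration, not the authors' actual method:

```python
from collections import deque

def free_cells(height_map, max_obstacle_height=0.15):
    """Mark a floor cell free if the tallest depth sample above it stays
    below a clearance threshold (hypothetical parameter, in metres)."""
    return [[h <= max_obstacle_height for h in row] for row in height_map]

def walkable_area(free, start):
    """4-connected flood fill: the walkable area is the connected
    component of free cells containing the user's starting cell."""
    rows, cols = len(free), len(free[0])
    seen = {start}
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols \
                    and free[nr][nc] and (nr, nc) not in seen:
                seen.add((nr, nc))
                queue.append((nr, nc))
    return seen

# Toy 4x4 height map (metres above floor); the 0.9 column is a wall,
# so the cells behind it are unreachable even though they are free.
heights = [
    [0.0, 0.0, 0.9, 0.0],
    [0.0, 0.1, 0.9, 0.0],
    [0.0, 0.0, 0.9, 0.0],
    [0.0, 0.0, 0.9, 0.0],
]
wa = walkable_area(free_cells(heights), (0, 0))
```

Here the flood fill returns the 8 cells to the left of the wall; the free column behind the wall is excluded, which is the distinction between "obstacle-free" and "walkable" that the system relies on.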

Citation (APA)

Sra, M., Garrido-Jurado, S., Schmandt, C., & Maes, P. (2016). Procedurally generated virtual reality from 3D reconstructed physical space. In Proceedings of the ACM Symposium on Virtual Reality Software and Technology, VRST (Vol. 02-04-November-2016, pp. 191–200). Association for Computing Machinery. https://doi.org/10.1145/2993369.2993372
