Portable depth-sensing cameras allow users to control interfaces with hand gestures at short range from the camera. These technologies are being combined with virtual reality (VR) headsets to produce immersive VR experiences that respond more naturally to user actions. In this research, we explore gesture-based interaction in immersive VR games using the Unity game engine, the Leap Motion sensor, a laptop, a smartphone, and the Freefly VR headset. By avoiding Android deployment, this novel setup allowed for fast, affordable prototyping and testing of different ideas for immersive VR interaction. We implemented a system that lets users play a game in a virtual world and compared desk-mounted and head-mounted placements of the Leap Motion sensor. In this experimental setup, users interacted with a numeric dial panel and then played a Tetris game inside the VR environment by pressing the buttons of a virtual panel. The results suggest that, although the tracking quality of the Leap Motion sensor was rather limited in the head-mounted setup for pointing and selection tasks, its performance was much better in the desk-mounted setup, providing a novel platform for research and rapid application development.
Zhang, Y., & Meruvia-Pastor, O. (2017). Operating virtual panels with hand gestures in immersive VR games: Experiences with the leap motion controller. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 10324 LNCS, pp. 299–308). Springer Verlag. https://doi.org/10.1007/978-3-319-60922-5_24