Eye Gaze-based Object Rotation for Head-mounted Displays

Abstract

Hands-free manipulation of 3D objects has long been a challenge for augmented and virtual reality (AR/VR). While many methods use eye gaze to assist hand-based manipulation, no interface yet provides efficient, fully gaze-based 6 degree-of-freedom (DoF) manipulation. To address this problem, we implemented three methods for rotating virtual objects with gaze: RotBar, which maps line-of-sight eye gaze onto per-axis rotations; RotPlane, which uses orthogonal planes to achieve per-axis angular rotations; and RotBall, which combines a traditional arcball with an external ring to handle user-perspective roll manipulations. We validated the efficiency of each method in a user study involving a series of orientation tasks along different axes. Experimental results showed that users completed single-axis orientation tasks significantly faster and more accurately with RotBar and RotPlane than with RotBall, whereas for multi-axis orientation tasks RotBall significantly outperformed RotBar and RotPlane in both speed and accuracy.
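The arcball underlying RotBall is a standard technique: a 2D pointer (here, the gaze point) is projected onto a virtual sphere, and the rotation between two successive sphere points defines the object rotation. The sketch below is a minimal, hedged illustration of that general mapping in Python with NumPy, not the authors' implementation; function names and the hyperbolic-sheet fallback outside the sphere are illustrative assumptions.

```python
import numpy as np

def map_to_sphere(x, y, radius=1.0):
    """Project a 2D point (normalized screen coords in [-1, 1])
    onto the arcball sphere. Points outside the sphere fall back
    to a hyperbolic sheet, a common arcball variant (assumption)."""
    p = np.array([x, y, 0.0])
    d2 = x * x + y * y
    r2 = radius * radius
    if d2 <= r2 / 2.0:
        p[2] = np.sqrt(r2 - d2)          # point lies on the sphere
    else:
        p[2] = r2 / (2.0 * np.sqrt(d2))  # point lies on the sheet
    return p / np.linalg.norm(p)

def arcball_rotation(p0, p1):
    """Axis-angle rotation taking unit sphere point p0 to p1."""
    axis = np.cross(p0, p1)
    n = np.linalg.norm(axis)
    if n < 1e-9:                          # degenerate: no rotation
        return np.zeros(3), 0.0
    angle = np.arccos(np.clip(np.dot(p0, p1), -1.0, 1.0))
    return axis / n, angle
```

Because a pure arcball cannot produce roll about the view axis from points on the sphere's front face, RotBall adds an external ring for user-perspective roll, as described in the abstract.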


CITATION STYLE

APA

Liu, C., Orlosky, J., & Plopski, A. (2020). Eye Gaze-based Object Rotation for Head-mounted Displays. In Proceedings - SUI 2020: ACM Symposium on Spatial User Interaction. Association for Computing Machinery, Inc. https://doi.org/10.1145/3385959.3418444
