PalmGazer: Unimanual Eye-hand Menus in Augmented Reality


Abstract

How can we design user interfaces for augmented reality (AR) so that interaction is as simple, flexible, and expressive as one-handed smartphone use? To explore this question, we propose PalmGazer, an interaction concept that integrates eye and hand input to establish a single-handedly operable menu system. In particular, PalmGazer is designed to support quick and spontaneous digital commands, such as playing a music track, checking notifications, or browsing visual media, through our three-way interaction model: opening the hand to summon the menu UI, eye-hand input to select items, and a dragging gesture to navigate. A key aspect is that the menu remains always accessible and movable, as it supports meaningful hand- and head-based reference frames. We demonstrate the concept in practice through a prototypical mobile UI with application probes, and describe interaction techniques tailored specifically to the application UI. A qualitative evaluation highlights the system's interaction benefits and drawbacks, e.g., that common 2D scroll and selection tasks are simple to operate, but higher degrees of freedom may be reserved for two hands. Our work contributes interaction techniques and design insights to expand AR's unimanual capabilities.

Citation (APA)

Pfeuffer, K., Obernolte, J., Dietz, F., Mäkelä, V., Sidenmark, L., Manakhov, P., … Alt, F. (2023). PalmGazer: Unimanual Eye-hand Menus in Augmented Reality. In Proceedings - SUI 2023: ACM Symposium on Spatial User Interaction. Association for Computing Machinery, Inc. https://doi.org/10.1145/3607822.3614523
