Abstract
With AR/VR devices becoming increasingly common around us, user authentication has become a critical challenge. While typing passwords is straightforward with a keyboard, it is cumbersome with conventional AR/VR input techniques such as in-air gestures and hand-held controllers. In this work, we developed a fluent authentication technique that allows AR/VR users to unlock their profiles with simple head gestures (e.g., nodding). This resembles the powerful "Slide to Unlock" interaction on touch-screen devices. Specifically, we extract bio-features such as neck length and head radius from IMU sensor readings of these head gestures and identify users with machine learning. Though our approach is less strict than conventional password-based methods, we believe its swiftness greatly facilitates scenarios with frequent user switching (e.g., device sharing across team and family members) that demand quick authentication. Through a 10-participant evaluation, we demonstrated that our system is robust and accurate, achieving an average accuracy of 97.1% on groups of five users, simulating family and lab use.
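The abstract describes extracting per-user bio-features (e.g., an effective head rotation radius) from IMU readings of a nod and feeding them to a classifier. A minimal illustrative sketch of that pipeline is below; it is not the paper's actual method. All feature definitions, parameter values, and the synthetic data generator are assumptions, using the rough circular-motion relation |a| ≈ ω²r to recover a radius-like feature.

```python
# Hypothetical sketch of IMU-based head-gesture user identification.
# Not the paper's implementation: features, constants, and data are illustrative.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def radius_feature(accel, gyro):
    """Estimate an effective rotation radius r from |a| ~ w^2 * r during a nod.
    accel, gyro: arrays of shape (n, 3). Simplified physics, assumption only."""
    a = np.linalg.norm(accel, axis=1)
    w2 = np.linalg.norm(gyro, axis=1) ** 2
    mask = w2 > 1e-6                      # avoid dividing by near-zero rotation
    return float(np.median(a[mask] / w2[mask]))

def featurize(accel, gyro):
    """Small illustrative feature vector for one recorded gesture."""
    return [
        radius_feature(accel, gyro),
        float(np.ptp(np.linalg.norm(gyro, axis=1))),   # gesture intensity range
        float(np.mean(np.linalg.norm(accel, axis=1))),  # mean acceleration
    ]

rng = np.random.default_rng(0)

def synthetic_nod(radius, n=200):
    """Generate fake IMU samples for a nod with a given head radius (meters)."""
    w = rng.normal(2.0, 0.2, size=(n, 3))               # angular velocity, rad/s
    direction = rng.normal([0.577, 0.577, 0.577], 0.01, size=(n, 3))
    a = (np.linalg.norm(w, axis=1) ** 2 * radius)[:, None] * direction
    return a, w

# Train on three simulated users with distinct head radii, then identify a nod.
X, y = [], []
for user, r in enumerate([0.10, 0.13, 0.16]):           # per-user radii (m)
    for _ in range(20):
        a, w = synthetic_nod(r)
        X.append(featurize(a, w))
        y.append(user)

clf = RandomForestClassifier(random_state=0).fit(X, y)
a, w = synthetic_nod(0.13)
pred = int(clf.predict([featurize(a, w)])[0])            # expected: user 1
```

The radius feature dominates here because the synthetic users differ only in head radius; a real system would draw on richer temporal features and per-user gesture dynamics.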
Wang, X., & Zhang, Y. (2021). Nod to Auth: Fluent AR/VR Authentication with User Head-Neck Modeling. In Conference on Human Factors in Computing Systems - Proceedings. Association for Computing Machinery. https://doi.org/10.1145/3411763.3451769