STMG: A Machine Learning Microgesture Recognition System for Supporting Thumb-Based VR/AR Input

Abstract

AR/VR devices have started to adopt hand tracking, in lieu of controllers, to support user interaction. However, today's hand input relies primarily on one gesture: pinch. Moreover, current mappings of hand motion to use cases like VR locomotion and content scrolling involve larger and more complex arm motions than joystick or trackpad usage. STMG increases the gesture space by recognizing additional small thumb-based microgestures from skeletal tracking running on a headset. We take a machine learning approach and achieve 95.1% recognition accuracy across seven thumb gestures performed on the index finger surface: four directional thumb swipes (left, right, forward, backward), thumb tap, and fingertip pinch start and pinch end. We detail the components of our machine learning pipeline and highlight our design decisions and lessons learned in producing a well-generalized model. We then demonstrate how these microgestures simplify and reduce arm motions for hand-based locomotion and scrolling interactions.
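The abstract does not specify the model architecture, so as a purely illustrative sketch (all function names, features, and data below are hypothetical, not the paper's pipeline), a microgesture recognizer consuming skeletal-tracking frames might classify a short window of thumb-tip positions, taken relative to the index finger, into one of the four directional swipes by its dominant displacement axis:

```python
import numpy as np

# Hypothetical sketch only: STMG uses a learned ML pipeline; this rule-based
# stand-in just illustrates the kind of skeletal features such a system
# might consume (thumb-tip motion relative to the index finger surface).

def classify_swipe(window: np.ndarray) -> str:
    """window: (T, 2) array of thumb-tip (x, y) positions over T frames,
    expressed in the index finger's local coordinate frame."""
    dx, dy = window[-1] - window[0]      # net displacement over the window
    if abs(dx) >= abs(dy):               # dominant axis decides the label
        return "right" if dx > 0 else "left"
    return "forward" if dy > 0 else "backward"

# Synthetic example: thumb slides ~3 cm rightward across the index finger.
t = np.linspace(0, 1, 30)
right_swipe = np.stack([0.03 * t, 0.002 * np.sin(6 * t)], axis=1)
print(classify_swipe(right_swipe))  # -> right
```

A learned model (as in the paper) would replace the hand-written rule with a classifier trained on labeled windows, which also lets it cover taps and pinch start/end events that simple displacement thresholds cannot separate reliably.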

Citation (APA)

Kin, K., Wan, C., Koh, K., Marin, A., Camgöz, N. C., Zhang, Y., … Ma, S. (2024). STMG: A Machine Learning Microgesture Recognition System for Supporting Thumb-Based VR/AR Input. In Conference on Human Factors in Computing Systems - Proceedings. Association for Computing Machinery. https://doi.org/10.1145/3613904.3642702
