Gesture Knitter: A Hand Gesture Design Tool for Head-Mounted Mixed Reality Applications

Abstract

Hand gestures are a natural and expressive input method enabled by modern mixed reality headsets. However, it remains challenging for developers to create custom gestures for their applications. Conventional strategies for bespoke gesture recognition involve either hand-crafting or data-intensive deep learning. Neither approach is well suited for rapid prototyping of new interactions. This paper introduces a flexible and efficient alternative approach for constructing hand gestures. We present Gesture Knitter: a design tool for creating custom gesture recognizers with minimal training data. Gesture Knitter allows the specification of gesture primitives that can then be combined to create more complex gestures using a visual declarative script. Designers can build custom recognizers by declaring them from scratch or by providing a demonstration that is automatically decoded into its primitive components. Our developer study shows that Gesture Knitter achieves high recognition accuracy despite minimal training data and delivers an expressive and creative design experience.
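To make the compositional idea concrete, the sketch below shows one plausible way gesture primitives could be declared and chained into a more complex gesture recognizer. It is a minimal illustration only: the primitive names, feature keys, thresholds, and the sequential composition operator are assumptions for this sketch and do not reflect Gesture Knitter's actual visual script, API, or recognition pipeline.

```python
# Hypothetical sketch: composing hand-gesture primitives into a sequential
# recognizer. All names and thresholds are illustrative assumptions, not
# Gesture Knitter's actual implementation.

from dataclasses import dataclass
from typing import Callable, Sequence


@dataclass
class Primitive:
    """A low-level hand motion with a per-frame matcher (e.g., trained on few examples)."""
    name: str
    matches: Callable[[dict], bool]  # hand-tracking feature frame -> bool


def sequence(*primitives: Primitive) -> Callable[[Sequence[dict]], bool]:
    """Build a recognizer for a composite gesture: primitives must occur in order."""
    def recognize(frames: Sequence[dict]) -> bool:
        idx = 0
        for frame in frames:
            if idx < len(primitives) and primitives[idx].matches(frame):
                idx += 1
        return idx == len(primitives)
    return recognize


# Example: a "grab and pull" gesture declared from two primitives.
pinch = Primitive("pinch", lambda f: f.get("thumb_index_dist", 1.0) < 0.02)
pull_back = Primitive("pull_back", lambda f: f.get("palm_velocity_z", 0.0) < -0.3)

grab_and_pull = sequence(pinch, pull_back)

# Frames would normally come from the headset's hand-tracking stream.
frames = [
    {"thumb_index_dist": 0.015, "palm_velocity_z": 0.0},
    {"thumb_index_dist": 0.015, "palm_velocity_z": -0.5},
]
print(grab_and_pull(frames))  # True
```

In the paper's workflow, such compositions are specified visually rather than in code, and primitives can also be recovered automatically by decoding a recorded demonstration into its constituent components.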

Citation (APA)

Mo, G. B., Dudley, J. J., & Kristensson, P. O. (2021). Gesture Knitter: A Hand Gesture Design Tool for Head-Mounted Mixed Reality Applications. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery. https://doi.org/10.1145/3411764.3445766
