AnyGesture: Arbitrary One-Handed Gestures for Augmented, Virtual, and Mixed Reality Applications


Abstract

Natural user interfaces based on hand gestures are becoming increasingly popular. Because hand tracking long required expensive hardware, a wide range of interaction possibilities it enables has remained largely unexplored. Recently, hand tracking has been built into inexpensive and widely available devices, giving more and more people access to this technology. This work provides researchers and users with a simple yet effective way to implement various one-handed gestures, enabling deeper exploration of gesture-based interactions and interfaces. To this end, it presents a framework for the design, prototyping, testing, and implementation of one-handed gestures. The framework was implemented with two main goals: first, it should be able to recognize any one-handed gesture; second, designing and implementing a gesture should be as simple as performing it and pressing a button to record it. The contribution of this paper is a simple yet unique way to record and recognize static and dynamic one-handed gestures. A static gesture is captured with a template matching approach, while dynamic gestures use previously captured spatial information. The approach was evaluated in a user study with 33 participants, and the implemented gestures achieved high recognition accuracy and user acceptance.
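The static-gesture pipeline the abstract describes (perform a pose, press a button to record it, then recognize it by template matching on tracked hand joints) can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the joint layout (wrist at index 0), the normalization scheme, and names such as record_template and matches_template are hypothetical.

    import numpy as np

    def normalize_joints(joints):
        """Translate joints so the wrist (assumed at index 0) is the
        origin and divide by overall hand extent, making the template
        invariant to hand position and roughly invariant to hand size."""
        joints = np.asarray(joints, dtype=float)
        centered = joints - joints[0]
        scale = np.linalg.norm(centered, axis=1).max()
        return centered / scale if scale > 0 else centered

    def record_template(joints):
        """'Press a button to record': store the normalized pose."""
        return normalize_joints(joints)

    def matches_template(joints, template, threshold=0.1):
        """Static-gesture recognition: compare the mean per-joint
        distance between the current normalized pose and the
        recorded template against a tolerance threshold."""
        error = np.linalg.norm(
            normalize_joints(joints) - template, axis=1).mean()
        return error < threshold

A dynamic gesture could then be handled, as the abstract suggests, by recording a sequence of such poses together with their spatial trajectory and matching incoming frames against that stored sequence; the threshold value above is an illustrative placeholder, not a figure from the paper.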

Citation (APA)

Schäfer, A., Reis, G., & Stricker, D. (2022). AnyGesture: Arbitrary One-Handed Gestures for Augmented, Virtual, and Mixed Reality Applications. Applied Sciences (Switzerland), 12(4). https://doi.org/10.3390/app12041888
