Enhancing Virtual and Augmented Reality Interactions with a MediaPipe-Based Hand Gesture Recognition User Interface


Abstract

Interaction with virtual objects is crucial for presence in Virtual Reality (VR) and Augmented Reality (AR) applications. However, controllers are still predominantly used for operations in virtual spaces. Hand gestures offer a more intuitive approach than keyboards and mice for interactions in these environments. In previous research, hand motion classification was implemented using only simple heuristics. This study introduces a User Interface (UI) that employs MediaPipe and artificial intelligence to use hand gestures as an input device. Unlike the previous research, which could identify only one gesture, the current implementation classifies three gestures (pointer, pick, and fist) with 95.4% accuracy. Efforts were made to optimize the pipeline, including an examination of multi-threading and PyWin32. While multi-threading did not yield significant improvements, using PyWin32 roughly tripled the Frames Per Second (FPS) compared with running without it. Further gestures can potentially be added to extend the system's capabilities. This line of research has potential applications in diverse fields such as gaming, simulation, rehabilitation, and smart home technology.
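The abstract does not detail the authors' AI classifier, but MediaPipe Hands exposes 21 (x, y) landmarks per hand, and the three gestures named above can be separated from that landmark geometry. The sketch below is purely illustrative and is not the paper's model: it uses a simple rule-based stand-in (tip-versus-PIP distance for finger extension, a thumb–index pinch distance for "pick"), and the landmark indices follow MediaPipe's layout while the pinch threshold is an assumed value.

```python
import math

# MediaPipe Hands landmark indices (21 points per hand).
WRIST = 0
THUMB_TIP = 4
TIPS = {"index": 8, "middle": 12, "ring": 16, "pinky": 20}
PIPS = {"index": 6, "middle": 10, "ring": 14, "pinky": 18}


def dist(a, b):
    """Euclidean distance between two (x, y) landmarks."""
    return math.hypot(a[0] - b[0], a[1] - b[1])


def extended(lm, finger):
    # Treat a finger as extended when its tip lies farther from the
    # wrist than its PIP joint does (i.e., the finger is straightened).
    return dist(lm[TIPS[finger]], lm[WRIST]) > dist(lm[PIPS[finger]], lm[WRIST])


def classify(lm, pinch_thresh=0.07):
    """Map 21 normalized (x, y) landmarks to 'pick', 'pointer', or 'fist'.

    pinch_thresh is an assumed tuning constant, not a value from the paper.
    """
    # "pick": thumb and index fingertips pinched together.
    if dist(lm[THUMB_TIP], lm[TIPS["index"]]) < pinch_thresh:
        return "pick"
    fingers = [f for f in TIPS if extended(lm, f)]
    if fingers == ["index"]:
        return "pointer"  # only the index finger extended
    if not fingers:
        return "fist"     # all fingers curled toward the wrist
    return "unknown"
```

In a live pipeline, `lm` would come from `results.multi_hand_landmarks` after calling MediaPipe's `Hands.process()` on each camera frame; the paper replaces rules like these with a trained classifier, which is what lifts recognition to three gestures at 95.4% accuracy.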

Citation
Jo, B. J., Kim, S. K., & Kim, S. K. (2023). Enhancing Virtual and Augmented Reality Interactions with a MediaPipe-Based Hand Gesture Recognition User Interface. Ingénierie des Systèmes d'Information, 28(3), 633–638. https://doi.org/10.18280/isi.280311
