Hand Gesture Detection for Sign Language using Neural Network with Mediapipe

  • Alvin A
  • Shabrina N
  • Ryo A
  • Christian E
Citations: N/A · Readers: 37 (Mendeley users who have this article in their library)

Abstract

The mouse and keyboard are the most common way of interfacing with computer systems. Hand gestures offer an intuitive and effective touchless alternative. However, gesture-based systems have seen low adoption among end-users, primarily because of the technical hurdles in detecting in-air gestures accurately. This paper presents Hand Gesture Detection for American Sign Language using K-Nearest Neighbor with Mediapipe, a framework developed to bridge this gap. The framework learns to detect gestures from demonstrations, is customizable by end-users, and enables real-time gesture interaction with computers equipped with only RGB cameras.
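The pipeline the abstract describes (hand landmarks extracted by MediaPipe, then classified by K-Nearest Neighbor against stored demonstrations) could be sketched roughly as below. This is an illustrative sketch, not the authors' code: the landmark extraction is stubbed with synthetic 63-dimensional vectors (MediaPipe Hands yields 21 landmarks with x, y, z each), and all names and values are assumptions.

```python
import math
from collections import Counter

NUM_LANDMARKS = 21  # MediaPipe Hands returns 21 (x, y, z) landmarks per hand


def knn_classify(query, examples, k=3):
    """Classify a flattened landmark vector by majority vote among the
    k nearest stored demonstrations, using Euclidean distance."""
    dists = sorted((math.dist(query, vec), label) for vec, label in examples)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]


# Hypothetical "demonstrations": two gesture classes encoded as synthetic
# 63-dim vectors. In the real framework these would come from MediaPipe's
# hand-landmark output recorded while the user demonstrates each gesture.
open_hand = [0.9] * (NUM_LANDMARKS * 3)
fist = [0.1] * (NUM_LANDMARKS * 3)
examples = [(open_hand, "open"), (fist, "fist")] * 2

query = [0.8] * (NUM_LANDMARKS * 3)  # a new frame's landmarks
print(knn_classify(query, examples))  # -> open
```

Because KNN stores the demonstrations directly rather than fitting a model, end-users can add or replace gestures simply by recording new examples, which matches the customizability the abstract claims.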

Citation (APA)

Alvin, A., Shabrina, N. H., Ryo, A., & Christian, E. (2021). Hand Gesture Detection for Sign Language using Neural Network with Mediapipe. Ultima Computing : Jurnal Sistem Komputer, 13(2), 57–62. https://doi.org/10.31937/sk.v13i2.2109
