Human Computer Interaction Based on Hand Gesture

  • Patel, J., et al.

Abstract

Many hand-controlled robots have been developed to help visually impaired people live more confidently. This project proposes a human-computer interaction system based on wireless gesture recognition that allows physically handicapped persons to steer a robot in the desired direction. The work is organised into three stages. First, gesture capture and recognition: a laptop or PC camera captures the hand, and gestures are recognised with a finger-count algorithm. Second, wireless data transmission: a ZigBee module transmits the recognised command serially. Finally, robot movement: the robot moves according to how many fingers are open or closed, and the laptop or PC displays the direction in which the robot is moving. The system can assist physically handicapped people in their daily lives. The hardware comprises an Arduino Uno, a ZigBee module, and an L293D motor driver.
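
As a rough illustration of the PC-side pipeline described above, the sketch below captures camera frames, estimates the number of extended fingers with a convexity-defect heuristic, and writes a one-byte movement command to the serial port of the ZigBee module. The port name, skin-colour thresholds, and finger-count-to-command mapping are illustrative assumptions, not values taken from the paper, and the paper's own finger-count algorithm may differ.

# Minimal sketch of the PC side: camera capture, finger counting, serial send.
# Serial port, HSV thresholds, and command mapping are assumptions.
import cv2
import numpy as np
import serial

zigbee = serial.Serial("COM3", 9600, timeout=1)   # assumed ZigBee serial port

# Assumed mapping from finger count to movement command byte.
COMMANDS = {0: b"S", 1: b"F", 2: b"B", 3: b"L", 4: b"R"}

def count_fingers(frame):
    """Estimate the number of extended fingers in a BGR frame."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Rough skin-colour mask (assumed thresholds; tune for the actual lighting).
    mask = cv2.inRange(hsv, np.array([0, 30, 60]), np.array([20, 150, 255]))
    mask = cv2.medianBlur(mask, 5)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0
    hand = max(contours, key=cv2.contourArea)
    hull = cv2.convexHull(hand, returnPoints=False)
    if hull is None or len(hull) < 4:
        return 0
    defects = cv2.convexityDefects(hand, hull)
    if defects is None:
        return 0
    # Each sufficiently deep convexity defect lies between two extended fingers.
    gaps = sum(1 for i in range(defects.shape[0]) if defects[i, 0, 3] > 10000)
    return min(gaps + 1, 5) if gaps else 0

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    n = count_fingers(frame)
    zigbee.write(COMMANDS.get(n, b"S"))            # unknown count -> stop
    cv2.putText(frame, "fingers: %d" % n, (10, 30),
                cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
    cv2.imshow("gesture", frame)                   # shows the detected direction on the PC
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
zigbee.close()

On the robot side, an Arduino Uno sketch would read the received byte from the ZigBee module's serial link and drive the motors through the L293D accordingly; that half is not shown here.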

Citation (APA)

Patel, J., & U., S. A. (2020). Human Computer Interaction Based on Hand Gesture. International Journal of Innovative Technology and Exploring Engineering, 9(5), 895–898. https://doi.org/10.35940/ijitee.d1345.039520
