GESTURE-BASED HUMAN-COMPUTER INTERACTION

  • N. Meghana
  • K. Sri Lakshmi
  • M. Naga Lakshmi Tejasree
  • K. Srujana
  • N. Ashok

Abstract

This project explores touchless computer interaction using gesture recognition technology. By combining computer vision and gesture recognition, users can control their computers with hand gestures: a webcam captures input, OpenCV processes the frames, MediaPipe tracks hand movements, and PyAutoGUI drives mouse and keyboard actions. Users can perform gestures such as clicks, scrolls, and a custom gesture that opens Paint. The project integrates with a paint application, allowing users to create digital art with hand gestures and enhancing the overall user experience. By executing predefined gestures in the air, users can control their computers seamlessly, eliminating the limitations of conventional input devices such as the mouse.

KEYWORDS: Gesture-based interaction, Human-Computer Interaction, MediaPipe, OpenCV, Paint Application, PyAutoGUI, Virtual mouse.
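The gesture-classification step described above can be sketched in Python. MediaPipe Hands reports 21 landmarks per hand in normalized image coordinates, with fixed indices (4 = thumb tip, 8 = index tip, 12 = middle tip, and so on). The gesture names, finger heuristic, and pinch threshold below are illustrative assumptions, not the authors' actual code:

```python
import math

# MediaPipe hand-landmark indices: fingertips and the PIP joints below them.
FINGER_TIPS = {"index": 8, "middle": 12, "ring": 16, "pinky": 20}
FINGER_PIPS = {"index": 6, "middle": 10, "ring": 14, "pinky": 18}

def fingers_up(landmarks):
    """Return the set of extended fingers.

    A finger counts as extended when its tip is above (smaller y than)
    its PIP joint -- a common heuristic for an upright hand facing the
    camera. `landmarks` is a list of 21 (x, y) tuples.
    """
    return {
        name
        for name, tip in FINGER_TIPS.items()
        if landmarks[tip][1] < landmarks[FINGER_PIPS[name]][1]
    }

def classify_gesture(landmarks, pinch_threshold=0.05):
    """Map one frame of landmarks to an action like those in the abstract."""
    # Pinch (thumb tip near index tip) -> left click.
    if math.dist(landmarks[4], landmarks[8]) < pinch_threshold:
        return "click"
    up = fingers_up(landmarks)
    if up == {"index"}:
        return "move_cursor"
    if up == {"index", "middle"}:
        return "scroll"
    return "idle"
```

In the full system as the abstract describes it, OpenCV would capture webcam frames, MediaPipe would extract the landmarks, and the returned action would be dispatched to PyAutoGUI (e.g. `pyautogui.moveTo(...)` for cursor movement or `pyautogui.click()` for a click).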

CITATION STYLE

APA

N. Meghana, K. Sri Lakshmi, M. Naga Lakshmi Tejasree, K. Srujana, & N. Ashok. (2023). GESTURE-BASED HUMAN-COMPUTER INTERACTION. EPRA International Journal of Research & Development (IJRD), 237–241. https://doi.org/10.36713/epra14757
