Human-Computer Interaction Using Manual Hand Gestures in Real Time

Abstract

This paper describes the construction of an electronic system that recognises, in real time, twelve manual gestures made with one hand by an interlocutor under controlled lighting and background conditions. The implemented system tolerates hand rotations, translations, and scale changes in the camera plane. It runs on an Analog Devices ADSP BF-533 Ez-Kit Lite evaluation board. As a final development stage, displaying a letter associated with each recognised gesture is recommended; a visual representation of the proposed algorithm is also available in the visual toolbox of a personal computer. By connecting them to computers, this technology can help individuals who are deaf or hard of hearing communicate with the general population, and it is being used to create new applications.
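The abstract does not specify the recognition algorithm itself. One common way to achieve the invariance it claims (to translation, scaling, and in-plane rotation) is to describe the segmented hand silhouette with Hu moment invariants and classify by nearest template. The sketch below is purely illustrative, not the paper's method: the function names, the use of only the first two invariants, and the `templates` dictionary of twelve reference gestures are all assumptions, and the real system would run as fixed-point C on the DSP board rather than Python.

```python
import numpy as np

def hu_invariants(mask):
    """First two Hu moment invariants of a binary hand mask.

    Both are unchanged by translation and (approximately, on a
    discrete grid) by scaling; hu1 is also rotation-invariant.
    """
    ys, xs = np.nonzero(mask)                # foreground pixel coordinates
    m00 = float(len(xs))                     # area (zeroth-order moment)
    xc, yc = xs.mean(), ys.mean()            # centroid
    eta = {}
    for p, q in [(2, 0), (0, 2), (1, 1)]:
        mu = (((xs - xc) ** p) * ((ys - yc) ** q)).sum()   # central moment
        eta[(p, q)] = mu / m00 ** (1 + (p + q) / 2)        # normalized
    hu1 = eta[(2, 0)] + eta[(0, 2)]
    hu2 = (eta[(2, 0)] - eta[(0, 2)]) ** 2 + 4 * eta[(1, 1)] ** 2
    return np.array([hu1, hu2])

def classify(mask, templates):
    """Return the label (e.g. a letter) whose stored template
    invariants are closest to those of the observed mask."""
    v = hu_invariants(mask)
    return min(templates, key=lambda k: np.linalg.norm(v - templates[k]))
```

In such a scheme, `templates` would hold invariants precomputed from the twelve reference gestures, and `classify` would map each recognised gesture to the letter the system displays.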

Citation (APA)

Alsaffar, M., Alshammari, A., Alshammari, G., Almurayziq, T. S., Aljaloud, S., Alshammari, D., & Belay, A. (2021). Human-Computer Interaction Using Manual Hand Gestures in Real Time. Computational Intelligence and Neuroscience, 2021. https://doi.org/10.1155/2021/6972192
