Hand gesture detection with convolutional neural networks

Abstract

In this paper, we present a Deep Learning-based method for locating and recognizing hand gestures in images. Our goal is to provide an intuitive and accessible way to interact with Computer Vision-based mobile applications aimed at assisting visually impaired people (e.g. pointing a finger at an object in a real scene to zoom in for a close-up of that object). First, we defined a set of hand gestures that can be assigned to different actions. We then created a database of images corresponding to these gestures. Finally, this database was used to train Neural Networks with different topologies (testing different input sizes, weight initialization schemes, and data augmentation processes). In our experiments, we obtained high accuracy both in localization (96%–100%) and in recognition (99.45%) with networks suitable for porting to mobile devices.
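To illustrate the kind of training pipeline the abstract describes, below is a minimal sketch of a small convolutional network with data augmentation for gesture classification, assuming a TensorFlow/Keras setup. The input size (64x64), number of gesture classes, layer sizes, and dataset directory layout are illustrative assumptions, not the topologies or parameters reported in the paper.

import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 5           # hypothetical number of gesture classes
INPUT_SHAPE = (64, 64, 3) # one of several candidate input sizes to test

def build_gesture_cnn(input_shape=INPUT_SHAPE, num_classes=NUM_CLASSES):
    # Lightweight topology so the model stays small enough for mobile devices.
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu", kernel_initializer="he_normal"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu", kernel_initializer="he_normal"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Simple data augmentation over the gesture image database
# (one sub-folder per gesture class is assumed here).
augmenter = tf.keras.preprocessing.image.ImageDataGenerator(
    rescale=1.0 / 255,
    rotation_range=15,
    width_shift_range=0.1,
    height_shift_range=0.1,
    zoom_range=0.1,
    horizontal_flip=True,
)

if __name__ == "__main__":
    model = build_gesture_cnn()
    model.summary()
    # Example training call (hypothetical "gestures/train" directory):
    # train_gen = augmenter.flow_from_directory(
    #     "gestures/train", target_size=INPUT_SHAPE[:2],
    #     batch_size=32, class_mode="categorical")
    # model.fit(train_gen, epochs=20)

Varying the input resolution, the kernel initializer, and the augmentation parameters in this sketch corresponds to the topology and training choices the abstract says were compared experimentally.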

Citation (APA)

Alashhab, S., Gallego, A. J., & Lozano, M. Á. (2019). Hand gesture detection with convolutional neural networks. In Advances in Intelligent Systems and Computing (Vol. 800, pp. 45–52). Springer Verlag. https://doi.org/10.1007/978-3-319-94649-8_6
