New combined DT-CWT and HOG descriptor for static and dynamic hand gesture recognition

Abstract

In recent years, researchers have focused on developing human-computer interfaces that are fast, intuitive, and allow direct interaction with the computing environment. Hand gestures are one of the most natural means of communication. In this context, many systems have been developed to recognize hand gestures using various vision-based techniques; however, these systems are strongly affected by acquisition constraints such as resolution, noise, lighting conditions, hand shape, and pose. To improve performance under such constraints, we propose a static and dynamic hand gesture recognition system that applies the Dual-Tree Complex Wavelet Transform (DT-CWT) to produce an approximation image with less noise and redundancy. The Histogram of Oriented Gradients (HOG) is then applied to the resulting image to extract the relevant information and produce a compact feature vector. For classification, we compare the performance of three artificial neural networks, namely MLP, PNN, and RBNN; Random Decision Forest and SVM classifiers are also used to further improve the efficiency of our system. Experimental evaluation is performed on four datasets composed of alphabet signs and dynamic gestures. The results demonstrate the effectiveness of the combined features, with recognition rates comparable to the state of the art.
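
The feature-extraction pipeline described in the abstract can be sketched in a few lines of Python. This is only an illustrative outline, not the authors' implementation: the library choices (the dtcwt package, scikit-image's hog, scikit-learn's SVC) and all parameter values (number of decomposition levels, HOG cell and block sizes, SVM kernel) are assumptions made for the example.

# Minimal sketch of the DT-CWT + HOG descriptor described in the abstract.
import numpy as np
import dtcwt
from skimage.feature import hog
from sklearn.svm import SVC

def dtcwt_hog_features(gray_image, levels=2):
    """Return a compact HOG descriptor computed on the DT-CWT approximation image."""
    # Dual-Tree Complex Wavelet Transform: the low-pass band is a smoothed,
    # lower-resolution approximation with reduced noise and redundancy.
    pyramid = dtcwt.Transform2d().forward(gray_image.astype(float), nlevels=levels)
    approx = pyramid.lowpass
    # Histogram of Oriented Gradients on the approximation image
    # (cell/block sizes here are illustrative, not the paper's settings).
    return hog(approx, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), block_norm='L2-Hys', feature_vector=True)

# Example classification stage with an SVM, one of the classifiers compared in the paper;
# X_train and y_train would hold grayscale gesture images and their labels.
# clf = SVC(kernel='rbf')
# clf.fit([dtcwt_hog_features(img) for img in X_train], y_train)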

Citation (APA)
Agab, S. E., & Chelali, F. Z. (2023). New combined DT-CWT and HOG descriptor for static and dynamic hand gesture recognition. Multimedia Tools and Applications, 82(17), 26379–26409. https://doi.org/10.1007/s11042-023-14433-x
