Dynamic Hand Gesture Recognition for Smart Lifecare Routines via K-Ary Tree Hashing Classifier


Abstract

In recent years, home appliances have been shaped by new technologies and changing consumer trends, and a universal gesture-based remote control is among the most desired gadgets of this time. Hand gestures offer a natural, contact-free way to control home appliances. This paper presents a novel method of recognizing hand gestures for smart home appliances using imaging sensors. The proposed model is divided into six steps. First, preprocessing de-noises the video frames and resizes each frame to a fixed dimension. Second, the hand is detected using a single-shot-detector-based convolutional neural network (SSD-CNN) model. Third, landmarks are localized on the hand using the skeleton method. Fourth, features are extracted based on point-based trajectories, frame differencing, orientation histograms, and 3D point clouds. Fifth, the features are optimized using fuzzy logic, and last, the H-Hash (K-ary tree hashing) classifier is used to classify the hand gestures. The system is tested on two benchmark datasets, the IPN Hand dataset and the Jester dataset, on which it achieves recognition accuracies of 88.46% and 87.69%, respectively. With the proposed system, users can control smart home appliances such as televisions, radios, air conditioners, and vacuum cleaners.
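Two of the feature types named in the abstract, frame differencing and orientation histograms, can be sketched with NumPy alone. This is an illustrative sketch, not the authors' implementation; the function names, bin count, and toy frames are assumptions for demonstration.

```python
import numpy as np

def frame_difference(prev_frame: np.ndarray, frame: np.ndarray) -> np.ndarray:
    """Absolute per-pixel difference between consecutive grayscale frames,
    a simple motion cue for dynamic gestures."""
    return np.abs(frame.astype(np.float32) - prev_frame.astype(np.float32))

def orientation_histogram(frame: np.ndarray, bins: int = 8) -> np.ndarray:
    """Histogram of gradient orientations, weighted by gradient magnitude,
    normalized so the bins sum to 1 (when any gradient is present)."""
    gy, gx = np.gradient(frame.astype(np.float32))  # row/column gradients
    magnitude = np.hypot(gx, gy)
    angle = np.arctan2(gy, gx)                      # orientation in [-pi, pi]
    hist, _ = np.histogram(angle, bins=bins, range=(-np.pi, np.pi),
                           weights=magnitude)
    total = hist.sum()
    return hist / total if total > 0 else hist

# Toy example: a bright pixel that moves one column between frames.
f0 = np.zeros((4, 4)); f0[1, 1] = 255.0
f1 = np.zeros((4, 4)); f1[1, 2] = 255.0
diff = frame_difference(f0, f1)      # highlights both old and new positions
feat = orientation_histogram(f1)     # 8-bin orientation descriptor
```

In a real pipeline these per-frame descriptors would be computed inside the detected hand region and concatenated across frames before the fuzzy-logic optimization step.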

Citation (APA)

Ansar, H., Ksibi, A., Jalal, A., Shorfuzzaman, M., Alsufyani, A., Alsuhibany, S. A., & Park, J. (2022). Dynamic Hand Gesture Recognition for Smart Lifecare Routines via K-Ary Tree Hashing Classifier. Applied Sciences (Switzerland), 12(13). https://doi.org/10.3390/app12136481
