Deep-learning methods for hand-gesture recognition using ultra-wideband radar

Abstract

Using deep-learning techniques to analyze radar signatures has opened new possibilities in the field of smart sensing, especially in applications such as hand-gesture recognition. In this paper, we present a deep-learning framework to classify hand-gesture signatures generated by an ultra-wideband (UWB) impulse radar. We extract the signals of 14 different hand gestures and represent each signature as a 3-dimensional tensor consisting of a range-Doppler frame sequence. These signatures are passed to a convolutional neural network (CNN) to extract the unique features of each gesture, which are then fed to a classifier. We compare 4 different classification architectures for predicting the gesture class, namely: (i) a fully connected neural network (FCNN), (ii) k-nearest neighbours (k-NN), (iii) a support vector machine (SVM), and (iv) a long short-term memory (LSTM) network. The shape of the range-Doppler-frame tensor and the parameters of the classifiers are optimized to maximize the classification accuracy. The classification results of the proposed architectures show a high level of accuracy, above 96%, and a very low confusion probability even between similar gestures.
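To make the described pipeline concrete, below is a minimal PyTorch sketch of one of the architectures mentioned in the abstract: a per-frame CNN feature extractor over range-Doppler maps followed by an LSTM classifier over the frame sequence, with 14 output classes. The tensor dimensions, layer sizes, and hyperparameters are illustrative assumptions, not the values used in the paper.

```python
import torch
import torch.nn as nn

# Assumed input shape: (batch, frames, 1, doppler_bins, range_bins).
# The frame count and bin sizes here are placeholders, not the paper's values.
class GestureCNNLSTM(nn.Module):
    def __init__(self, num_classes: int = 14, hidden_size: int = 64):
        super().__init__()
        # Per-frame CNN feature extractor over each range-Doppler map.
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
            nn.Flatten(),  # -> 32 * 4 * 4 = 512 features per frame
        )
        # LSTM aggregates per-frame features across the gesture sequence.
        self.lstm = nn.LSTM(input_size=512, hidden_size=hidden_size,
                            batch_first=True)
        self.head = nn.Linear(hidden_size, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, t, c, h, w = x.shape
        # Apply the CNN to every frame, then restore the sequence dimension.
        feats = self.cnn(x.view(b * t, c, h, w)).view(b, t, -1)
        _, (h_n, _) = self.lstm(feats)
        return self.head(h_n[-1])  # class logits for the 14 gestures


if __name__ == "__main__":
    # Dummy batch: 2 gestures, 30 frames each, 32x32 range-Doppler maps.
    dummy = torch.randn(2, 30, 1, 32, 32)
    logits = GestureCNNLSTM()(dummy)
    print(logits.shape)  # torch.Size([2, 14])
```

Swapping the LSTM head for a k-NN, SVM, or fully connected classifier, as the paper compares, would amount to replacing the recurrent layer with a classifier applied to the flattened or pooled CNN features.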

Cite

APA

Skaria, S., Al-Hourani, A., & Evans, R. J. (2020). Deep-learning methods for hand-gesture recognition using ultra-wideband radar. IEEE Access, 8, 203580–203590. https://doi.org/10.1109/ACCESS.2020.3037062
