Unfamiliar dynamic hand gestures recognition based on zero-shot learning

Abstract

Most existing robots can recognize trained hand gestures to interpret a user's intent, but untrained dynamic hand gestures are difficult to understand correctly. This paper presents a dynamic hand gesture recognition approach based on Zero-Shot Learning (ZSL), which can recognize untrained hand gestures and predict the user's intention. To this end, we utilize a Bidirectional Long Short-Term Memory (BLSTM) network to extract hand gesture features from skeletal joint data collected by a Leap Motion Controller (LMC). This data is used to construct a novel dynamic hand gesture dataset for human-robot interaction applications, comprising twenty common hand gestures annotated with fifteen concrete semantic attributes. Based on these features and semantic attributes, a Semantic Autoencoder (SAE) is employed to learn a mapping from the feature space to the semantic space. By matching the most similar semantic information, unfamiliar hand gestures are recognized as accurately as possible. Experimental results on our dataset indicate that the proposed approach can effectively identify unfamiliar hand gestures.
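The zero-shot pipeline described above can be sketched in a few lines: the SAE learns a linear projection W from feature space to attribute space with a closed-form Sylvester-equation solution, and an unfamiliar gesture is then classified by projecting its features and matching the nearest unseen-class attribute vector. This is a minimal sketch, assuming BLSTM features are already extracted; all shapes, class counts, and attribute values below are illustrative placeholders, not the paper's actual dataset.

```python
# Minimal zero-shot gesture recognition sketch with a Semantic
# Autoencoder (SAE). Feature dim, class counts, and attributes are
# hypothetical, not taken from the paper's LMC dataset.
import numpy as np
from scipy.linalg import solve_sylvester

rng = np.random.default_rng(0)

d, k = 64, 15           # feature dim (e.g. BLSTM output), 15 semantic attributes
n_train = 200           # training samples from seen gesture classes

# Training data: features X (d x N) and per-sample attribute vectors S (k x N)
X = rng.standard_normal((d, n_train))
S = rng.standard_normal((k, n_train))

# SAE objective: min_W ||X - W.T S||^2 + lam * ||W X - S||^2
# Closed form reduces to a Sylvester equation:
#   (S S.T) W + W (lam X X.T) = (1 + lam) S X.T
lam = 0.5
W = solve_sylvester(S @ S.T, lam * (X @ X.T), (1 + lam) * S @ X.T)

# Inference: project an unfamiliar gesture's features into semantic space
# and match against attribute vectors of unseen gesture classes.
unseen_attrs = rng.standard_normal((5, k))   # 5 hypothetical unseen classes
x_test = rng.standard_normal(d)
s_pred = W @ x_test

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

scores = [cosine(s_pred, a) for a in unseen_attrs]
pred_class = int(np.argmax(scores))          # index of best-matching unseen class
```

The Sylvester-equation solve is the standard closed-form SAE training step; at test time, nearest-neighbor matching in attribute space is what allows classes never seen during training to be recognized.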

Citation (APA)

Wu, J., Li, K., Zhao, X., & Tan, M. (2018). Unfamiliar dynamic hand gestures recognition based on zero-shot learning. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11305 LNCS, pp. 244–254). Springer Verlag. https://doi.org/10.1007/978-3-030-04221-9_22
