HSFE network and fusion model based dynamic hand gesture recognition

4 citations · 12 Mendeley readers

Abstract

Dynamic hand gesture recognition (d-HGR) plays an important role in human-computer interaction (HCI) systems. With the growth of hand-pose estimation and 3D depth sensors, depth and hand-skeleton datasets have been proposed, prompting much research on depth-based and 3D hand-skeleton approaches. However, recognition remains a challenging problem due to low resolution, high complexity, and self-occlusion. In this paper, we propose a hand-shape feature extraction (HSFE) network to produce robust hand-shape features. We build LSTM-based models on the hand-shape features and the hand skeleton to exploit temporal information from hand-shape and motion changes. Fusing the two models yields the best accuracy on the dynamic hand gesture (DHG) dataset.
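The abstract describes fusing a hand-shape (HSFE + LSTM) stream with a hand-skeleton LSTM stream. As a minimal sketch of one common fusion strategy, the snippet below averages the per-class probabilities of two streams at score level; the weighting, the function names, and the use of 14 classes (as in DHG-14) are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def fuse_scores(shape_logits, skeleton_logits, w=0.5):
    """Hypothetical late fusion: weighted average of per-class
    probabilities from the hand-shape and skeleton streams."""
    p = w * softmax(shape_logits) + (1.0 - w) * softmax(skeleton_logits)
    return int(p.argmax(axis=-1))

# Toy example with 14 gesture classes: both streams favor class 3,
# so the fused prediction is class 3.
shape_logits = np.zeros(14)
shape_logits[3] = 2.0
skeleton_logits = np.zeros(14)
skeleton_logits[3] = 1.0
print(fuse_scores(shape_logits, skeleton_logits))  # → 3
```

Score-level fusion like this lets each stream be trained independently; the weight `w` could be tuned on a validation split.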

Citation (APA)

Tai, D. N., Na, I. S., & Kim, S. H. (2020). HSFE network and fusion model based dynamic hand gesture recognition. KSII Transactions on Internet and Information Systems, 14(9), 3924–3940. https://doi.org/10.3837/tiis.2020.09.020
