Generating Hand Posture and Motion Dataset for Hand Pose Estimation in Egocentric View

Abstract

Hand interaction is one of the main input modalities for augmented reality glasses. Vision-based approaches using deep learning have been applied to hand tracking with good results. Training a deep neural network requires a large dataset of hand information, but collecting real hand data is laborious: annotation is expensive, and real recordings lack diversity in skin tones, lighting conditions, and backgrounds. In this paper, we propose a method for generating a synthetic hand dataset that covers diverse human and environmental parameters. By applying the anatomical constraints of a human hand, we obtain realistic hand poses for the dataset. We also generate dynamic hand animations that can be used for hand gesture recognition.
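
As an illustration of the constraint-based pose sampling the abstract describes, the following is a minimal Python sketch that draws random finger joint angles within anatomical limits. The limit values and the DIP-PIP coupling rule are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch of constraint-based hand pose sampling.
# Joint limits and the coupling rule below are illustrative assumptions,
# not values from the paper.
import numpy as np

# Approximate flexion ranges (degrees) for one finger's joints: MCP, PIP, DIP.
JOINT_LIMITS_DEG = {
    "mcp_flexion": (0.0, 90.0),
    "pip_flexion": (0.0, 100.0),
    "dip_flexion": (0.0, 80.0),
}

def sample_finger_pose(rng: np.random.Generator) -> dict:
    """Sample one finger's joint angles uniformly within anatomical limits."""
    pose = {name: rng.uniform(lo, hi) for name, (lo, hi) in JOINT_LIMITS_DEG.items()}
    # A common biomechanical approximation: DIP flexion follows
    # roughly two thirds of PIP flexion, clipped to its own range.
    pose["dip_flexion"] = float(np.clip(
        2.0 / 3.0 * pose["pip_flexion"], *JOINT_LIMITS_DEG["dip_flexion"]
    ))
    return pose

rng = np.random.default_rng(0)
print(sample_finger_pose(rng))
```

Sampling inside such per-joint ranges, with couplings between dependent joints, rules out anatomically impossible configurations and yields plausible poses for a synthetic dataset.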

Cite

APA

Park, H., Kim, D., Yim, S., Kwon, T., Jeong, J., Lee, W., … Lee, G. (2022). Generating Hand Posture and Motion Dataset for Hand Pose Estimation in Egocentric View. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 13317 LNCS, pp. 329–337). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-05939-1_22
