Hand interaction is one of the main input modalities for augmented reality glasses. Vision-based approaches using deep learning have been applied to hand tracking with good results. Training a deep neural network, however, requires a large dataset of hand information, and collecting real hand data is laborious because of the extensive annotation effort and the limited diversity of skin tones, lighting conditions, and backgrounds. In this paper, we propose a method to generate a synthetic hand dataset that covers diverse human and environmental parameters. By applying the kinematic constraints of the human hand, we obtain realistic hand poses for the dataset. We also generate dynamic hand animations that can be used for hand gesture recognition.
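As a rough illustration of the kind of anatomical constraint mentioned above, the sketch below samples random finger joint angles and clamps them to plausible ranges. The joint names, angle limits, and the DIP–PIP coupling rule are illustrative assumptions, not values taken from the paper.

```python
import random

# Hypothetical flexion limits (degrees) for one finger's three joints
# (MCP, PIP, DIP) -- illustrative values only, not from the paper.
JOINT_LIMITS = {"MCP": (0.0, 90.0), "PIP": (0.0, 110.0), "DIP": (0.0, 80.0)}

def sample_finger_pose(rng=random):
    """Sample a random but anatomically plausible finger pose by
    constraining each joint angle to its allowed range."""
    pose = {j: rng.uniform(lo, hi) for j, (lo, hi) in JOINT_LIMITS.items()}
    # Assumed coupling constraint: DIP flexion roughly tracks 2/3 of
    # PIP flexion, so cap DIP near that value.
    pose["DIP"] = min(pose["DIP"], (2.0 / 3.0) * pose["PIP"] + 15.0)
    return pose

pose = sample_finger_pose()
print(pose)
```

Sampling within per-joint limits (plus inter-joint coupling) is one common way to avoid physically impossible poses in synthetic datasets; the paper's actual constraint model may differ.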
CITATION STYLE
Park, H., Kim, D., Yim, S., Kwon, T., Jeong, J., Lee, W., … Lee, G. (2022). Generating Hand Posture and Motion Dataset for Hand Pose Estimation in Egocentric View. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 13317 LNCS, pp. 329–337). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-05939-1_22