3D Hand Joints Position Estimation with Graph Convolutional Networks: A GraphHands Baseline


Abstract

State-of-the-art deep learning models for hand-related challenges, e.g. 3D hand joint estimation, need vast amounts of annotated data to perform well, and this lack of data is a problem of paramount importance. Consequently, training deep learning models on synthetic datasets has become a trend and represents a promising avenue for improving existing approaches. Nevertheless, currently existing synthetic datasets lack accurate and complete annotations, realism, and rich hand-object interactions. To address this, in our work we present a synthetic dataset featuring rich hand-object interactions in photorealistic scenarios, with broad potential applications to hand-related challenges. To validate our data, we propose an initial approach to 3D hand joint estimation using a graph convolutional network fed with point cloud data. A further advantage of our dataset is that the interactions are performed with realistic objects taken from the YCB dataset, so systems trained on our synthetic data can be tested on real images/videos in which the same objects are manipulated.
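To illustrate the kind of model the abstract describes, the following is a minimal sketch of one graph-convolution layer applied to point-cloud-style per-point features. All names, shapes, and the toy chain graph are illustrative assumptions for exposition; this is not the authors' GraphHands implementation.

```python
import numpy as np

def normalize_adjacency(A):
    """Symmetrically normalize with self-loops: D^{-1/2} (A + I) D^{-1/2}."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def gcn_layer(H, A_norm, W):
    """One graph-convolution layer: ReLU(A_norm @ H @ W)."""
    return np.maximum(A_norm @ H @ W, 0.0)

# Toy example (assumed setup): 5 points with xyz coordinates as input
# features, connected in a simple chain graph.
rng = np.random.default_rng(0)
A = np.zeros((5, 5))
for i in range(4):
    A[i, i + 1] = A[i + 1, i] = 1.0   # undirected chain edges

H = rng.normal(size=(5, 3))   # per-point xyz features
W = rng.normal(size=(3, 8))   # weight matrix (random stand-in for learned weights)

out = gcn_layer(H, normalize_adjacency(A), W)
print(out.shape)  # (5, 8): one 8-dim feature vector per point
```

In a full pipeline, several such layers would be stacked and followed by a regression head predicting the 21 hand-joint 3D positions; the details above are a sketch under those stated assumptions.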

Citation (APA)

Castro-Vargas, J. A., Garcia-Garcia, A., Oprea, S., Martinez-Gonzalez, P., & Garcia-Rodriguez, J. (2020). 3D Hand Joints Position Estimation with Graph Convolutional Networks: A GraphHands Baseline. In Advances in Intelligent Systems and Computing (Vol. 1093 AISC, pp. 551–562). Springer. https://doi.org/10.1007/978-3-030-36150-1_45
