3D interaction through a real-time gesture search engine


Abstract

3D gesture recognition and tracking are highly desired features of interaction design in future mobile and smart environments. Specifically, in virtual/augmented reality applications, intuitive interaction with the physical space seems unavoidable, and 3D gestural interaction might be the most effective alternative to current input facilities such as touchscreens. In this paper, we introduce a novel solution for real-time 3D gesture-based interaction by finding the best match from an extremely large gesture database. This database includes images of various articulated hand gestures annotated with the 3D position/orientation parameters of the hand joints. Our matching algorithm is based on hierarchical scoring of low-level edge-orientation features between the query frames and the database, retrieving the best match. Once the best match is found in the database at each moment, the pre-recorded 3D motion parameters can instantly be used for natural interaction. The proposed bare-hand interaction technology performs in real time with high accuracy using an ordinary camera.
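The core retrieval idea in the abstract, scoring low-level edge-orientation features of a query frame against database entries, can be sketched as follows. This is a minimal illustration only: the feature (a global edge-orientation histogram) and the score (histogram intersection) are simplifications, not the paper's hierarchical scoring scheme, and all function names are hypothetical.

```python
import numpy as np

def edge_orientation_features(image, n_bins=8):
    """Magnitude-weighted edge-orientation histogram of a grayscale image.

    A simplified stand-in for the paper's low-level edge-orientation
    features: gradient orientations are quantized into n_bins over [0, pi)
    and accumulated weighted by edge strength, then normalized.
    """
    gy, gx = np.gradient(image.astype(float))
    mag = np.hypot(gx, gy)                       # edge strength
    ang = np.mod(np.arctan2(gy, gx), np.pi)      # orientation in [0, pi)
    bins = np.minimum((ang / np.pi * n_bins).astype(int), n_bins - 1)
    hist = np.bincount(bins.ravel(), weights=mag.ravel(), minlength=n_bins)
    total = hist.sum()
    return hist / total if total > 0 else hist

def best_match(query_feat, database_feats):
    """Index of the database entry with the highest similarity score.

    Histogram intersection is used as a toy score; the database entry's
    pre-recorded 3D hand-joint parameters would then be looked up by
    this index for interaction.
    """
    scores = [np.minimum(query_feat, feat).sum() for feat in database_feats]
    return int(np.argmax(scores))
```

In use, each database image is reduced to its feature vector offline; at runtime, the query frame's features are computed once per frame and scored against all entries (or, as in the paper, hierarchically to stay real-time).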

Citation (APA)

Yousefi, S., & Li, H. (2015). 3D interaction through a real-time gesture search engine. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9009, pp. 199–213). Springer Verlag. https://doi.org/10.1007/978-3-319-16631-5_15
