3D gesture-based view manipulator for large scale entity model review


Abstract

Hand gesture-based Human Computer Interaction (HCI) is one of the most natural and intuitive methods of communication between humans and machines, because it closely mimics how humans interact with each other. This intuitiveness and naturalness make it well suited to exploring extensive, complex data or virtual realities. We developed a 3D gesture interface for manipulating the display of a 3D entity model. For gesture recognition, we use the Kinect as a depth sensor to acquire depth image frames, track the position of the user's skeleton in each frame, and detect preset gestures. With simple gestures, the user can pan, zoom, rotate, and reset the view, and freely navigate inside the 3D entity model in the virtual space. The proposed gesture interface is integrated with the stereoscopic 3D model viewer that we previously developed for 3D model review. © 2012 Springer-Verlag.
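The abstract describes a per-frame pipeline: acquire a depth frame, read the tracked skeleton joints, and match them against preset gestures that map to pan, zoom, and rotate. The paper does not give its detection rules, so the sketch below is a hypothetical illustration of that idea: it classifies a gesture from the hand-joint positions of two consecutive frames (joint dicts, distance thresholds, and gesture names are all assumptions, not the authors' method).

```python
import math

def classify_gesture(prev, curr, move_thresh=0.05, zoom_thresh=0.05):
    """Classify a preset gesture from two consecutive skeleton frames.

    `prev` and `curr` map joint names to (x, y, z) positions in metres,
    a stand-in for the skeleton data a depth-sensor SDK reports per frame.
    Thresholds and gesture rules here are illustrative assumptions.
    """
    def dist(a, b):
        return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

    # Hands moving apart or together -> zoom in / out.
    spread_prev = dist(prev["left_hand"], prev["right_hand"])
    spread_curr = dist(curr["left_hand"], curr["right_hand"])
    if abs(spread_curr - spread_prev) > zoom_thresh:
        return "zoom_in" if spread_curr > spread_prev else "zoom_out"

    # One hand moving while the other stays still -> pan or rotate.
    left_moved = dist(prev["left_hand"], curr["left_hand"]) > move_thresh
    right_moved = dist(prev["right_hand"], curr["right_hand"]) > move_thresh
    if right_moved and not left_moved:
        return "pan"
    if left_moved and not right_moved:
        return "rotate"

    return "idle"
```

A real implementation would smooth joint positions over several frames before thresholding, since raw depth-sensor skeletons are noisy.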

Citation (APA)

Park, H. J., Park, J., & Kim, M. H. (2012). 3D gesture-based view manipulator for large scale entity model review. In Communications in Computer and Information Science (Vol. 323 CCIS, pp. 524–533). https://doi.org/10.1007/978-3-642-34384-1_62
