Virtual flying experience contents using upper-body gesture recognition

Abstract

In this paper, we describe an algorithm and an interactive content that let a user experience the feeling of a bird's flight through gesture recognition of the user's upper body. The algorithm assumes that a gesture is composed of several key poses. To recognize the user's gesture, we first classify the user's pose into one of several predefined key poses and then analyze the sequence of those poses. In the key-pose recognition step, the upper-body configuration is estimated from the joint locations in the depth image of a Kinect camera. When the user performs a continuous motion, the content recognizes the key poses and then synthesizes a gesture according to the order in which the key poses occur. The content consists of three stages so that the user can enjoy a variety of flight experiences. © Springer-Verlag Berlin Heidelberg 2013.
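The following is a minimal sketch, not the authors' implementation, of the two-stage idea the abstract describes: each frame's upper-body configuration (Kinect joint locations) is classified into one of several predefined key poses, and a gesture is then recognized from the order in which those key poses appear. The specific key-pose names, joints, rules, and gesture patterns below are illustrative assumptions.

```python
from collections import namedtuple

Joint = namedtuple("Joint", ["x", "y", "z"])  # 3-D joint position from the depth camera


def classify_key_pose(joints):
    """Map an upper-body skeleton (dict: joint name -> Joint) to a key-pose label.

    The rules below (hand height vs. shoulder height) are assumed for illustration;
    the paper does not specify the exact key poses or decision rules.
    """
    left_hand, right_hand = joints["hand_left"], joints["hand_right"]
    shoulder_y = (joints["shoulder_left"].y + joints["shoulder_right"].y) / 2.0

    if left_hand.y > shoulder_y and right_hand.y > shoulder_y:
        return "WINGS_UP"
    if left_hand.y < shoulder_y and right_hand.y < shoulder_y:
        return "WINGS_DOWN"
    return "NEUTRAL"


# A gesture is modeled as an ordered sequence of key poses, consistent with the
# abstract's assumption that "a gesture is composed of several key poses".
GESTURES = {
    "FLAP": ["WINGS_UP", "WINGS_DOWN", "WINGS_UP"],
    "GLIDE": ["WINGS_UP", "NEUTRAL", "NEUTRAL"],
}


def recognize_gesture(pose_history):
    """Return the first gesture whose key-pose sequence occurs in pose_history."""
    # Collapse consecutive duplicates so a pose held over many frames counts once.
    collapsed = [p for i, p in enumerate(pose_history)
                 if i == 0 or p != pose_history[i - 1]]
    for name, pattern in GESTURES.items():
        # Check whether `pattern` occurs as a contiguous subsequence.
        for start in range(len(collapsed) - len(pattern) + 1):
            if collapsed[start:start + len(pattern)] == pattern:
                return name
    return None


if __name__ == "__main__":
    # Simulated per-frame key-pose labels from consecutive depth frames.
    history = ["NEUTRAL", "WINGS_UP", "WINGS_UP", "WINGS_DOWN", "WINGS_UP"]
    print(recognize_gesture(history))  # -> "FLAP"
```

In a live setup, `classify_key_pose` would run on every tracked skeleton frame and feed its label into a rolling history buffer passed to `recognize_gesture`; the recognized gesture would then drive the flight content.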

Citation (APA)

Park, J. W., Oh, C. M., & Lee, C. W. (2013). Virtual flying experience contents using upper-body gesture recognition. In Communications in Computer and Information Science (Vol. 373, pp. 367–371). Springer Verlag. https://doi.org/10.1007/978-3-642-39473-7_74
