Interpret human gestures with a time of flight camera using standard image processing algorithms on a distributed system

Abstract

The development of human-computer interfaces is steadily moving away from peripheral devices such as mouse and keyboard in certain areas, as the evolution of smartphones, tablet PCs and touch-enabled operating systems over the last few years makes obvious. Nowadays we can even witness the transition from touch-based to touch-free interfaces. One common method to realize such interfaces is to incorporate new state-of-the-art 3D cameras (often called "Time of Flight" cameras). The difficulty lies in the evaluation of the sensor data to achieve robust detection and tracking of people within the scene in real time. We try to solve this task without using expensive knowledge-based approaches by employing standard image-processing algorithms, because we wanted to keep the required manpower and development time, as well as costs, as low as possible. © Springer-Verlag Berlin Heidelberg 2013.
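The abstract does not spell out which standard image-processing algorithms are used, but a typical first step on Time-of-Flight data is depth thresholding followed by blob localization. The following is a minimal sketch of that idea, not the authors' actual pipeline; the depth range (0.5 m to 2.0 m) and the toy image are illustrative assumptions.

```python
import numpy as np

def segment_foreground(depth, near=0.5, far=2.0):
    """Mask pixels whose depth (metres) lies in an assumed person/hand range.

    `near` and `far` are illustrative thresholds, not values from the paper.
    """
    return (depth > near) & (depth < far)

def centroid(mask):
    """Centroid (row, col) of the masked region, or None if the mask is empty."""
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None
    return float(ys.mean()), float(xs.mean())

# Toy 6x6 "depth image": a 2x2 blob at ~1.0 m in front of a 3.0 m background.
depth = np.full((6, 6), 3.0)
depth[2:4, 2:4] = 1.0

mask = segment_foreground(depth)
print(centroid(mask))  # blob centre at (2.5, 2.5)
```

Tracking over time can then be reduced to associating the centroid found in each frame with the nearest centroid from the previous frame, which keeps the per-frame cost low enough for real-time use.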

Citation (APA)

Froemmer, B., Roeder, N., & Hergenroether, E. (2013). Interpret human gestures with a time of flight camera using standard image processing algorithms on a distributed system. In Communications in Computer and Information Science (Vol. 373, pp. 317–321). Springer Verlag. https://doi.org/10.1007/978-3-642-39473-7_64
