A trajectory-based approach for device independent gesture recognition in multimodal user interfaces

Abstract

With the rise of technology in all areas of life, new interaction techniques are required. Since gestures and voice are among the most natural ways to interact, supporting them in human-computer interaction is an important goal. In this paper, we introduce our approach to multimodal interaction in smart home environments and illustrate how device independent gesture recognition can be of great support in this area. We describe a trajectory-based approach that supports device independent dynamic hand gesture recognition from vision systems, accelerometers, or pen devices. The recorded data from the different devices is transformed to a common basis (2D space), and feature extraction and recognition are performed on this basis. In a comprehensive case study we show the feasibility of the recognition and its integration with a multimodal and adaptive home operating system. © 2010 Springer-Verlag.
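
The core idea of the abstract can be illustrated with a short sketch: trajectories from heterogeneous devices are mapped to one normalized 2D representation before features are extracted for recognition. The sketch below is an illustrative assumption, not the authors' exact pipeline; the PCA-based projection of 3D accelerometer-derived paths, the chain-code-style direction features, and all function names are hypothetical choices made for this example.

```python
# Minimal sketch: map device trajectories to a common 2D basis, then extract
# direction features for a downstream recognizer. Assumptions (not from the
# paper): PCA projection for 3D input, 8-direction chain-code quantization.
import numpy as np

def to_common_2d(points: np.ndarray) -> np.ndarray:
    """Map a trajectory of shape (n, 2) or (n, 3) to a normalized 2D path.

    Pen/vision data is already 2D; 3D accelerometer-derived paths are
    projected onto their two dominant principal axes (an assumption).
    """
    pts = np.asarray(points, dtype=float)
    pts = pts - pts.mean(axis=0)                 # translation invariance
    if pts.shape[1] == 3:
        # project onto the plane spanned by the two largest principal components
        _, _, vt = np.linalg.svd(pts, full_matrices=False)
        pts = pts @ vt[:2].T
    scale = np.abs(pts).max()
    return pts / scale if scale > 0 else pts     # scale invariance

def direction_features(path: np.ndarray, n_bins: int = 8) -> np.ndarray:
    """Quantize successive movement directions into n_bins codes,
    yielding a chain-code-like feature sequence."""
    deltas = np.diff(path, axis=0)
    angles = np.arctan2(deltas[:, 1], deltas[:, 0])          # in [-pi, pi]
    return np.floor((angles + np.pi) / (2 * np.pi) * n_bins).astype(int) % n_bins

# Example: a circular stroke recorded as 3D accelerometer-derived positions
t = np.linspace(0, 2 * np.pi, 50)
accel_path_3d = np.stack([np.cos(t), np.sin(t), 0.1 * t], axis=1)
codes = direction_features(to_common_2d(accel_path_3d))
print(codes[:10])
```

Because pen, vision, and accelerometer input all end up as the same kind of normalized 2D feature sequence, a single recognizer can be trained and applied regardless of the capture device, which is the device independence the abstract refers to.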

Citation (APA)

Wilhelm, M., Roscher, D., Blumendorf, M., & Albayrak, S. (2010). A trajectory-based approach for device independent gesture recognition in multimodal user interfaces. In Lecture Notes in Computer Science (Vol. 6306, pp. 197–206). Springer. https://doi.org/10.1007/978-3-642-15841-4_21
