Extending the perceptual user interface to recognise movement

Abstract

Perceptual User Interfaces (PUIs) automatically extract user input from natural and implicit components of human activity such as gestures, direction of gaze, facial expression and body movement. This paper presents a Continuous Human Movement Recognition (CHMR) system for recognising a large range of specific movement skills from continuous 3D full-body motion. A new methodology defines an alphabet of dynemes, units of full-body movement skills, to enable recognition of diverse skills. Using multiple Hidden Markov Models, the CHMR system attempts to infer the movement skill that could have produced the observed sequence of dynemes. This approach enables the CHMR system to track and recognise hundreds of full-body movement skills from gait to twisting somersaults. This extends the perceptual user interface beyond frontal posing or tracking only one hand, to recognising and understanding full-body movement in terms of everyday activities. © Springer-Verlag Berlin Heidelberg 2004.
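The recognition step described above (scoring the observed dyneme sequence against one HMM per movement skill and choosing the best-fitting model) can be sketched as follows. This is an illustrative reconstruction only: the two toy models, the two-symbol dyneme alphabet, and all probabilities are assumptions, not parameters from the paper.

```python
def forward_likelihood(obs, pi, A, B):
    """P(obs | model) via the standard HMM forward algorithm.
    pi: initial state probabilities, A: state transition matrix,
    B: per-state emission probabilities over the dyneme alphabet."""
    n = len(pi)
    alpha = [pi[s] * B[s][obs[0]] for s in range(n)]
    for o in obs[1:]:
        alpha = [sum(alpha[s] * A[s][t] for s in range(n)) * B[t][o]
                 for t in range(n)]
    return sum(alpha)

def recognise(obs, models):
    """Return the skill label whose HMM best explains the dyneme sequence."""
    return max(models, key=lambda skill: forward_likelihood(obs, *models[skill]))

# Two hypothetical 2-state HMMs over a 2-symbol dyneme alphabet
# (0 = a gait-like dyneme, 1 = a twist-like dyneme).
models = {
    "gait":        ([0.9, 0.1], [[0.8, 0.2], [0.2, 0.8]], [[0.9, 0.1], [0.5, 0.5]]),
    "somersault":  ([0.5, 0.5], [[0.5, 0.5], [0.5, 0.5]], [[0.2, 0.8], [0.1, 0.9]]),
}

print(recognise([0, 0, 1, 0, 0], models))  # a mostly gait-like dyneme sequence
```

In the full system the models would be trained on 3D motion-capture data and the dyneme sequence would itself be decoded from continuous motion, but the maximum-likelihood model selection shown here is the core of the multi-HMM recognition idea.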

Citation (APA)

Green, R. (2004). Extending the perceptual user interface to recognise movement. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 3101, 121–132. https://doi.org/10.1007/978-3-540-27795-8_13
