Learning to Look at Humans

  • Walther, T.
  • Würtz, R. P.

Abstract

The problem of learning a generalisable model of the visual appearance of humans from video data is of major importance for computing systems that interact naturally with their users and with the other humans populating their environment. We propose a step towards automatic behaviour understanding by integrating principles of Organic Computing into the posture estimation cycle, thereby reducing the need for human intervention while simultaneously raising the level of system autonomy. The system extracts coherent motion from moving upper bodies and autonomously infers the limbs and their possible spatial relationships. The models learnt from many videos are integrated into meta-models, which generalise well to different individuals, backgrounds, and attire. These models even allow robust interpretation of single video frames, where all temporal continuity is missing.
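The abstract gives no implementation details, so the following is only a minimal sketch of the kind of coherent-motion extraction it mentions: dense optical flow between consecutive frames is thresholded and grouped into connected regions of coherently moving pixels, which a later stage could treat as limb candidates. The video filename, the Farneback parameters, and the magnitude threshold are illustrative assumptions and are not taken from the paper.

```python
# Illustrative sketch only: not the authors' algorithm.
# Dense optical flow is computed between consecutive frames and pixels with
# sufficient motion are grouped into connected regions ("coherent motion").
import cv2
import numpy as np

def coherent_motion_regions(video_path, mag_thresh=1.0):
    """Yield a label image of coherently moving regions for each frame pair."""
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    if not ok:
        return
    prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Dense Farneback optical flow; the numeric parameters are generic defaults.
        flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        mag, _ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
        moving = (mag > mag_thresh).astype(np.uint8)
        # Connected components give rough coherently moving regions.
        _n_labels, labels = cv2.connectedComponents(moving)
        yield labels
        prev_gray = gray
    cap.release()

if __name__ == "__main__":
    # "upper_body.avi" is a placeholder path, assumed for this example.
    for labels in coherent_motion_regions("upper_body.avi"):
        print("moving regions found:", labels.max())
```

In this sketch, how the flow field is clustered and how regions are linked over time are exactly the design choices the paper addresses via Organic Computing principles; the code above only illustrates the generic motion-segmentation starting point.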

Citation (APA)

Walther, T., & Würtz, R. P. (2011). Learning to Look at Humans. In Organic Computing — A Paradigm Shift for Complex Systems (pp. 309–322). Springer Basel. https://doi.org/10.1007/978-3-0348-0130-0_20
