A Bayesian framework for multi-cue 3D object tracking

Abstract

This paper presents a Bayesian framework for multi-cue 3D tracking of deformable objects. The proposed spatio-temporal object representation consists of a set of distinct linear subspace models, or Dynamic Point Distribution Models (DPDMs), which can handle both continuous and discontinuous appearance changes; the representation is learned fully automatically from training data. The representation is enriched with texture information by means of intensity histograms, which are compared using the Bhattacharyya coefficient. Direct 3D measurement is furthermore provided by a stereo system. State propagation is achieved by a particle filter which combines the three cues (shape, texture, and depth) in its observation density function. The tracking framework integrates an independently operating object detection system by means of importance sampling. We illustrate the benefit of our integrated multi-cue tracking approach on pedestrian tracking from a moving vehicle. © Springer-Verlag Berlin Heidelberg 2004.
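The abstract mentions comparing intensity histograms with the Bhattacharyya coefficient and fusing shape, texture, and depth cues in the particle filter's observation density. The sketch below illustrates these two ideas; the function names, the Gaussian likelihood on the Bhattacharyya distance, the sigma value, and the multiplicative cue fusion are illustrative assumptions, not details taken from the paper itself:

```python
import numpy as np

def bhattacharyya_coefficient(p, q):
    """Similarity of two histograms; 1.0 means identical distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Normalize so each histogram sums to 1 (guards against raw bin counts).
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(np.sqrt(p * q)))

def texture_likelihood(p, q, sigma=0.2):
    """Gaussian likelihood on the Bhattacharyya distance, a common choice
    in histogram-based particle tracking (sigma is a tuning assumption)."""
    d = np.sqrt(max(0.0, 1.0 - bhattacharyya_coefficient(p, q)))
    return float(np.exp(-d**2 / (2.0 * sigma**2)))

def combined_weight(l_shape, l_texture, l_depth):
    """One simple way to fuse per-cue likelihoods into a particle weight,
    assuming the cues are conditionally independent given the state."""
    return l_shape * l_texture * l_depth
```

In a particle filter, each particle's weight would be proportional to such a combined likelihood evaluated against the current shape, texture, and stereo-depth measurements, then normalized across particles before resampling.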

Citation (APA)

Giebel, J., Gavrila, D. M., & Schnörr, C. (2004). A Bayesian framework for multi-cue 3D object tracking. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 3024, 241–252. https://doi.org/10.1007/978-3-540-24673-2_20
