Fusing time-of-flight depth and color for real-time segmentation and tracking

66 citations · 70 Mendeley readers

Abstract

We present an improved framework for real-time segmentation and tracking that fuses time-of-flight depth and RGB color data. The added depth channel resolves common failure modes of RGB-only tracking and segmentation, such as occlusions, fast motion, and objects of similar color. Our real-time mean shift based algorithm outperforms the current state of the art and is significantly more robust in these difficult scenarios. © 2009 Springer Berlin Heidelberg.
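The abstract's core idea — that a fused color+depth feature space lets mean shift separate objects that color alone cannot — can be illustrated with a minimal sketch. This is not the paper's implementation; it is a toy flat-kernel mean shift over synthetic pixel features, where the 4-D vectors `(R, G, B, depth)`, the `bandwidth` value, and the synthetic data are all assumptions chosen for illustration:

```python
import numpy as np

def mean_shift_modes(features, bandwidth, n_iter=30):
    """Flat-kernel mean shift: repeatedly move each point's mode
    estimate to the mean of the original points within `bandwidth`."""
    modes = features.astype(float).copy()
    for _ in range(n_iter):
        for i, m in enumerate(modes):
            mask = np.linalg.norm(features - m, axis=1) < bandwidth
            modes[i] = features[mask].mean(axis=0)
    return modes

# Two synthetic objects with identical RGB color but different depth:
# color alone cannot distinguish them, while the fused
# (R, G, B, depth) feature vector can.
near = np.tile([100.0, 100.0, 100.0, 10.0], (5, 1))   # depth ~ 10
far  = np.tile([100.0, 100.0, 100.0, 200.0], (5, 1))  # depth ~ 200
pixels = np.vstack([near, far])

fused_modes = mean_shift_modes(pixels, bandwidth=50.0)
print(len(np.unique(np.round(fused_modes), axis=0)))       # 2 modes

color_modes = mean_shift_modes(pixels[:, :3], bandwidth=50.0)
print(len(np.unique(np.round(color_modes), axis=0)))       # 1 mode
```

With depth included, the two same-colored objects converge to two distinct modes; restricted to the color channels, all pixels collapse into one, which is the failure mode on similarly colored objects that the abstract describes.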

Citation (APA)

Bleiweiss, A., & Werman, M. (2009). Fusing time-of-flight depth and color for real-time segmentation and tracking. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5742 LNCS, pp. 58–69). https://doi.org/10.1007/978-3-642-03778-8_5
