Efficient feature tracking for long video sequences

Abstract

This work is concerned with real-time feature tracking for long video sequences. To achieve efficient and robust tracking, we propose two interrelated enhancements to the well-known Shi-Tomasi-Kanade tracker. Our first contribution is the integration of a linear illumination compensation method into the inverse compositional approach for affine motion estimation. The resulting algorithm combines the strengths of both components, achieving strong robustness and high efficiency at the same time. Our second enhancement copes with the feature drift problem, which is of special concern in long video sequences. To refine the initial frame-to-frame estimate of the feature position, our approach relies on the ability to robustly estimate the affine motion of every feature in every frame in real time. We demonstrate the performance of our enhancements with experiments on real video sequences. © Springer-Verlag 2004.
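The abstract's first contribution — jointly estimating motion and a linear (gain/bias) illumination model, with the Jacobian precomputed once from the template as in the inverse compositional scheme — can be sketched as follows. This is a simplified, translation-only illustration under assumed conventions, not the paper's algorithm: the paper estimates full affine motion, and all function names and parameters here are illustrative.

```python
import numpy as np

def bilinear(img, xs, ys):
    """Bilinearly interpolate img at float coordinates (xs, ys)."""
    h, w = img.shape
    x0 = np.clip(np.floor(xs).astype(int), 0, w - 2)
    y0 = np.clip(np.floor(ys).astype(int), 0, h - 2)
    fx, fy = xs - x0, ys - y0
    return (img[y0, x0] * (1 - fx) * (1 - fy)
            + img[y0, x0 + 1] * fx * (1 - fy)
            + img[y0 + 1, x0] * (1 - fx) * fy
            + img[y0 + 1, x0 + 1] * fx * fy)

def track(template, img, p, iters=50, tol=1e-6):
    """Translation-only tracker with joint gain/bias illumination estimation.

    The Jacobian and Gauss-Newton Hessian are precomputed from the template
    gradients — the efficiency idea behind the inverse compositional method.
    Residual model: r = I(x + p) - (a * T(x) + b), solved jointly for the
    translation p = [tx, ty], gain a, and bias b.
    """
    T = template.astype(float)
    h, w = T.shape
    gy, gx = np.gradient(T)  # template gradients stand in for image gradients
    # Jacobian columns: d r / d tx, d ty, d gain, d bias (all precomputed)
    J = np.column_stack([gx.ravel(), gy.ravel(), -T.ravel(), -np.ones(T.size)])
    Hinv = np.linalg.inv(J.T @ J)    # precomputed Gauss-Newton normal matrix
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    p = np.asarray(p, dtype=float)
    a, b = 1.0, 0.0                  # identity illumination model to start
    for _ in range(iters):
        I = bilinear(img, xs + p[0], ys + p[1])
        r = I.ravel() - (a * T.ravel() + b)  # illumination-compensated residual
        d = -Hinv @ (J.T @ r)                # Gauss-Newton update
        p += d[:2]
        a += d[2]
        b += d[3]
        if np.linalg.norm(d[:2]) < tol:
            break
    return p, a, b
```

Because the linear solve absorbs gain and bias changes, the same precomputed system tracks a feature even when the scene brightness shifts between frames — which is the point of folding illumination compensation into the inverse compositional framework rather than normalizing patches separately.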

Citation (APA)

Zinßer, T., Gräßl, C., & Niemann, H. (2004). Efficient feature tracking for long video sequences. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 3175, 326–333. https://doi.org/10.1007/978-3-540-28649-3_40
