We propose visual tracking over multiple temporal scales to handle occlusion and non-constant target motion. This is achieved by learning motion models from the target history at different temporal scales and applying them over multiple temporal scales in the future. These motion models are learned online in a computationally inexpensive manner. Reliable recovery of tracking after occlusions is achieved by extending the bootstrap particle filter to propagate particles at multiple temporal scales, possibly many frames ahead, guided by these motion models. In Bayesian tracking terms, the prior distribution at the current timestep is approximated by a mixture of the most probable modes of several previous posteriors, each propagated using its respective motion model. This improved and rich prior distribution, formed by the models learned and applied over multiple temporal scales, further makes the proposed method robust to complex target motion by covering a relatively large search space with reduced sampling effort. Extensive experiments have been carried out on both publicly available benchmarks and new video sequences. Results reveal that the proposed method successfully handles occlusions and a variety of rapid changes in target motion.
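The idea of forming the prior as a mixture of predictions made at several temporal scales can be sketched with a minimal bootstrap particle filter. The sketch below is an illustrative simplification, not the paper's implementation: `learn_velocity` stands in for the online motion-model learning (here just an average displacement over the last `scale` frames), and all function names, noise levels, and the Gaussian likelihood are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def learn_velocity(history, scale):
    # Hypothetical stand-in for online motion-model learning: the
    # average per-frame displacement over the last `scale` frames.
    scale = min(scale, len(history) - 1)
    return (history[-1] - history[-1 - scale]) / scale

def mixture_prior(history, scales, n_particles, noise=0.5):
    # One mixture component per temporal scale: predict the next state
    # with that scale's learned motion model, then add diffusion noise.
    per_component = n_particles // len(scales)
    dim = history[-1].size
    components = []
    for s in scales:
        predicted = history[-1] + learn_velocity(history, s)
        components.append(
            predicted + noise * rng.standard_normal((per_component, dim))
        )
    return np.vstack(components)

def weight_and_resample(particles, observation, obs_noise=1.0):
    # Bootstrap step: weight particles with a Gaussian likelihood of
    # the observation, then resample in proportion to the weights.
    d2 = np.sum((particles - observation) ** 2, axis=1)
    w = np.exp(-0.5 * d2 / obs_noise ** 2)
    w /= w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]
```

For a target moving with roughly constant velocity, the components at different scales agree and the mixture concentrates near the true next position; when motion changes abruptly or an occlusion hides recent frames, the components disagree and the mixture spreads over a larger search region, which is the behaviour the abstract describes.

```python
history = [np.array([t * 1.0, t * 2.0]) for t in range(6)]  # frames 0..5
particles = mixture_prior(history, scales=[1, 3, 5], n_particles=300)
estimate = weight_and_resample(particles, np.array([6.0, 12.0])).mean(axis=0)
```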
Citation:
Khan, M. H., Valstar, M. F., & Pridmore, T. P. (2015). MTS: A multiple temporal scale tracker handling occlusion and abrupt motion variation. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9007, pp. 476–492). Springer Verlag. https://doi.org/10.1007/978-3-319-16814-2_31