Visual Tracking with Long-Short Term Based Correlation Filter


Abstract

Visual object tracking is a fundamental problem in computer vision and has been greatly improved by the rapid development of deep learning. However, existing tracking methods with a single model update strategy cannot guarantee the robustness of the tracker in complex scenes. In this paper, we propose a novel real-time long-short-term multi-model tracking method. Since the fusion of long-term and short-term features contains more spatiotemporal information, three models with different update periods are designed to learn long-term and short-term features and improve tracking robustness. In addition, hierarchical features combining deep convolutional features and handcrafted features are used to represent the current object, which further improves tracking accuracy through richer semantic information. Finally, to correct the inaccurate prediction of the object position caused by the cosine window in the correlation filter, a bounding-box regression strategy is introduced to refine the final object position. Extensive experiments on the OTB, VOT, TC128, and UAV123 datasets demonstrate that the proposed method performs favorably against state-of-the-art algorithms while running at 24 fps.
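The multi-model idea in the abstract — several correlation-filter models that share the same detection step but are updated on different schedules, with their response maps fused — can be illustrated with a minimal sketch. The abstract does not give the paper's exact filter formulation, so this sketch uses a simple MOSSE-style filter in the frequency domain; the class names, the update periods `(1, 5, 25)`, the learning rate, and the average-fusion rule are all hypothetical illustrations, not the authors' actual design.

```python
import numpy as np

class PeriodicCFModel:
    """A MOSSE-style correlation-filter model updated every `period` frames.

    A short period tracks recent appearance (short-term model); a long
    period keeps a more stable, slowly changing template (long-term model).
    """
    def __init__(self, period, lr=0.125):
        self.period = period
        self.lr = lr        # online learning rate for the running averages
        self.A = None       # filter numerator   G * conj(Z)   (freq. domain)
        self.B = None       # filter denominator Z * conj(Z)

    def respond(self, Z):
        """Response map for the FFT `Z` of a search patch."""
        H = self.A / (self.B + 1e-8)           # regularized filter
        return np.real(np.fft.ifft2(H * Z))    # correlation via FFT

    def update(self, frame_idx, Z, G):
        """Update with patch FFT `Z` and desired-response FFT `G`,
        but only on this model's own schedule."""
        num, den = G * np.conj(Z), Z * np.conj(Z)
        if self.A is None:                     # first frame: initialize
            self.A, self.B = num, den
        elif frame_idx % self.period == 0:     # else update on period only
            self.A = (1 - self.lr) * self.A + self.lr * num
            self.B = (1 - self.lr) * self.B + self.lr * den

def fuse_responses(responses):
    """Fuse per-model response maps (simple averaging as a placeholder)."""
    return np.mean(responses, axis=0)

# Usage: three models with different (hypothetical) update periods,
# trained on one patch and queried on a shifted copy of it.
rng = np.random.default_rng(0)
patch = rng.standard_normal((32, 32))
g = np.zeros((32, 32)); g[16, 16] = 1.0        # desired peak at the center
G, Z = np.fft.fft2(g), np.fft.fft2(patch)

models = [PeriodicCFModel(p) for p in (1, 5, 25)]
for m in models:
    m.update(0, Z, G)

shifted = np.roll(np.roll(patch, 3, axis=0), 5, axis=1)
Z2 = np.fft.fft2(shifted)
fused = fuse_responses([m.respond(Z2) for m in models])
peak = np.unravel_index(np.argmax(fused), fused.shape)  # target moved by (3, 5)
```

Because correlation is shift-equivariant, the fused response peak lands at (16+3, 16+5), recovering the target's displacement; in the paper's full pipeline this position would then be refined by bounding-box regression.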

Citation (APA)

Yang, Y., Xing, W., Zhang, S., Gao, L., Yu, Q., Che, X., & Lu, W. (2020). Visual Tracking with Long-Short Term Based Correlation Filter. IEEE Access, 8, 20257–20269. https://doi.org/10.1109/ACCESS.2020.2968125
