The performance and accuracy of computer vision systems are degraded by noise in many forms. Although numerous algorithms have been proposed for dealing with individual noise types, a comprehensive technique that covers diverse noises and mitigates their damaging effects on the performance and precision of various systems is still missing. In this paper, we focus on the stability and robustness of one branch of computer vision, namely visual object tracking. We demonstrate that, without imposing a heavy computational load on a model or changing its algorithm, the drop in performance and accuracy that occurs when a system is exposed to an unseen, noise-laden test dataset can be prevented simply by applying style transfer to the training dataset and training the model on a combination of the stylized data and the original, unmodified data. To verify the proposed approach, we apply it to a generic object tracker based on regression networks. The method's validity is confirmed by testing it on a dedicated benchmark comprising 50 image sequences, each containing 15 types of noise at five intensity levels. The one-pass evaluation (OPE) curves obtained show a 40% increase in the robustness of the proposed object tracker against noise compared to the other trackers considered.
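The augmentation strategy the abstract describes can be sketched as follows. This is a minimal, hedged illustration, not the paper's implementation: the `stylize` function is a hypothetical placeholder standing in for a neural style-transfer model, and the dataset is represented abstractly as a list of frames.

```python
import random

def stylize(frame, style_id):
    """Hypothetical placeholder for a neural style-transfer model.

    In the paper's pipeline, each training frame would be re-rendered
    in a new visual style; here we just tag the frame with a style id
    to illustrate the data flow.
    """
    return {"frame": frame, "style": style_id}

def build_augmented_trainset(original_frames, num_styles=3, seed=0):
    """Combine the original training frames with style-transferred
    variants, so the model trains on both, as the abstract describes."""
    rng = random.Random(seed)
    augmented = list(original_frames)  # keep all original data
    for frame in original_frames:
        style_id = rng.randrange(num_styles)  # pick a random style
        augmented.append(stylize(frame, style_id))
    return augmented

# Usage sketch: three frames yield three originals plus three stylized copies.
train_set = build_augmented_trainset(["f0", "f1", "f2"])
print(len(train_set))  # 6
```

The key design point is that the stylized frames supplement rather than replace the originals, so the tracker still sees clean data while learning style-invariant features.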
Amirkhani, A., Barshooi, A. H., & Ebrahimi, A. (2021). Enhancing the robustness of visual object tracking via style transfer. Computers, Materials and Continua, 70(1), 981–997. https://doi.org/10.32604/cmc.2022.019001