Tracking objects in environments under non-uniform illumination conditions is particularly challenging, as the observed appearance may change in both space and time. Adapting the appearance model increases the risk of drift, while filtering out the illumination information through built-in invariance reduces discriminative power. In this work we adhere to color constancy principles to learn the appearance variation induced by non-uniform illumination, and we use this information to perform location-dependent color corrections that boost tracking performance. The training procedure is carried out in an unsupervised manner by exploiting walking people as illumination probes, and an online, non-parametric regression method is developed to densely predict the location-specific color transformations.
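The abstract does not specify the regression model or the form of the color transformations, so the following is only an illustrative sketch: it assumes diagonal (von Kries-style) RGB gains as the color transformation and Nadaraya-Watson kernel regression over image locations as one possible non-parametric predictor. All function and variable names here are hypothetical, not the paper's actual method or API.

```python
import numpy as np

def predict_gains(query_locs, train_locs, train_gains, bandwidth=40.0):
    """Kernel-regress per-location RGB channel gains (illustrative sketch).

    query_locs : (M, 2) pixel coordinates to predict at
    train_locs : (N, 2) locations where gains were observed, e.g. from
                 illumination probes such as tracked walking people
    train_gains: (N, 3) diagonal RGB correction factors at those locations
    """
    # Squared distances between every query and every training location
    d2 = ((query_locs[:, None, :] - train_locs[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))   # Gaussian kernel weights
    w /= w.sum(axis=1, keepdims=True)          # normalize weights per query
    return w @ train_gains                     # (M, 3) predicted gains

def correct_patch(patch, gains):
    """Apply a diagonal color transform to an (H, W, 3) patch in [0, 1]."""
    return np.clip(patch * gains, 0.0, 1.0)

# Usage: gains observed at two probe locations, predicted midway between them.
train_locs = np.array([[0.0, 0.0], [100.0, 0.0]])
train_gains = np.array([[1.2, 1.0, 0.8], [0.8, 1.0, 1.2]])
g = predict_gains(np.array([[50.0, 0.0]]), train_locs, train_gains)
# Midway between the two probes, the predicted gains average to [1, 1, 1].
```

A tracker could then color-correct each candidate patch with the gains predicted at its location before appearance matching, rather than baking invariance into the appearance model.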
Mutlu, S., Buló, S. R., & Lanz, O. (2014). Exploiting color constancy for robust tracking under non-uniform illumination. In Lecture Notes in Computer Science (Vol. 8815, pp. 403–410). Springer. https://doi.org/10.1007/978-3-319-11755-3_45