Texture dissimilarity measures for background change detection

Abstract

The presented framework provides a method for adaptive background change detection in video from a monocular static camera. A background change consists of objects left in the scene and objects moved or taken from the scene. The framework may be applied to detect luggage left behind in public places, to assess damage to and theft of public property, or to detect minute changes in the scene. The key elements of the framework are spatiotemporal motion detection, texture classification of non-moving regions, and spatial clustering of detected background changes. Motion detection based on local variation of spatiotemporal texture separates foreground from background regions. Local background dissimilarity is measured through wavelet decomposition of localized texture maps. A dynamic threshold on the normalized dissimilarity measure identifies changed local background blocks, and spatial clustering isolates the regions of interest. The results are demonstrated on the PETS 2006 video sequences. © 2008 Springer-Verlag Berlin Heidelberg.
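The pipeline described in the abstract (block-wise wavelet texture features, a normalized dissimilarity measure, a dynamic threshold, and spatial clustering of changed blocks) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the block size, the single-level Haar transform, the mean-plus-k-sigma threshold rule, and the 4-connected clustering are all assumptions made for the sketch, and the spatiotemporal motion-detection stage is omitted.

```python
import numpy as np

def haar_features(block):
    """One-level 2-D Haar decomposition of a square block; returns the
    approximation (LL) and detail (LH, HL, HH) coefficients as one vector."""
    a = block[0::2, 0::2]; b = block[0::2, 1::2]
    c = block[1::2, 0::2]; d = block[1::2, 1::2]
    ll = (a + b + c + d) / 4.0   # local average
    lh = (a - b + c - d) / 4.0   # horizontal detail
    hl = (a + b - c - d) / 4.0   # vertical detail
    hh = (a - b - c + d) / 4.0   # diagonal detail
    return np.concatenate([ll.ravel(), lh.ravel(), hl.ravel(), hh.ravel()])

def block_dissimilarity(frame, background, bs=8):
    """Normalized per-block dissimilarity between frame and background."""
    rows, cols = frame.shape[0] // bs, frame.shape[1] // bs
    dis = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            fb = haar_features(frame[r*bs:(r+1)*bs, c*bs:(c+1)*bs])
            bb = haar_features(background[r*bs:(r+1)*bs, c*bs:(c+1)*bs])
            dis[r, c] = np.linalg.norm(fb - bb)
    m = dis.max()
    return dis / m if m > 0 else dis  # scale into [0, 1]

def changed_blocks(dis, k=2.0):
    """Assumed dynamic threshold: mean + k * std of the normalized measure."""
    return dis > dis.mean() + k * dis.std()

def cluster_regions(mask):
    """Spatial clustering of changed blocks via 4-connected labeling."""
    labels = np.zeros(mask.shape, dtype=int)
    n = 0
    for r0 in range(mask.shape[0]):
        for c0 in range(mask.shape[1]):
            if mask[r0, c0] and labels[r0, c0] == 0:
                n += 1
                stack = [(r0, c0)]
                while stack:  # flood fill one region of interest
                    r, c = stack.pop()
                    if (0 <= r < mask.shape[0] and 0 <= c < mask.shape[1]
                            and mask[r, c] and labels[r, c] == 0):
                        labels[r, c] = n
                        stack += [(r+1, c), (r-1, c), (r, c+1), (r, c-1)]
    return labels, n
```

For example, inserting a uniform 8x8 patch into an otherwise static grayscale background yields one changed block, which the clustering step reports as a single region of interest.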

Citation (APA)

Miezianko, R., & Pokrajac, D. (2008). Texture dissimilarity measures for background change detection. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5112 LNCS, pp. 680–687). https://doi.org/10.1007/978-3-540-69812-8_67
