Thermal-Visible Video Fusion for Moving Target Tracking and Pedestrian Motion Analysis and Classification

  • Ran, Y.
  • Leykin, A.
  • Hammoud, R.

Abstract

This chapter presents a novel system for pedestrian surveillance, covering detection, tracking, classification, and, potentially, activity analysis. The proposed system first builds a background model as a multimodal distribution of colors and temperatures. It then employs a particle-filter scheme that applies a number of informed reversible transformations to sample the model probability space and maximize the posterior probability of the scene model. Observation likelihoods of moving objects account for their three-dimensional locations with respect to the camera and for occlusions by other tracked objects as well as by static obstacles. After capturing the coordinates and dimensions of moving objects, we apply a classifier based on periodic gait analysis. To differentiate humans from other moving objects such as cars, we detect a symmetrical double-helical pattern in human gait; such a pattern can then be analyzed using frieze group theory. Tracking results on color and thermal sequences demonstrate that the algorithm is robust to illumination changes and performs well in outdoor environments.
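The abstract describes the background model as a multimodal distribution over colors and temperatures from registered visible and thermal streams. The sketch below is a deliberately simplified illustration of that fusion idea, not the chapter's actual method: it keeps a single running Gaussian per pixel per modality and flags a pixel as foreground if either modality deviates strongly. The class, function, and parameter names (FusedBackgroundModel, update, alpha, k) are hypothetical.

```python
import numpy as np

class FusedBackgroundModel:
    """Per-pixel running Gaussian background model over two registered modalities.

    A minimal sketch: the chapter uses a multimodal distribution, whereas this
    example keeps only one Gaussian per pixel per modality for brevity.
    """

    def __init__(self, shape, alpha=0.01, k=3.0):
        self.alpha = alpha  # learning rate for the running statistics (assumed value)
        self.k = k          # foreground threshold, in standard deviations (assumed value)
        self.mean = {m: np.zeros(shape) for m in ("color", "thermal")}
        self.var = {m: np.ones(shape) for m in ("color", "thermal")}

    def update(self, frames):
        """frames: dict with 'color' and 'thermal' float arrays of identical shape.

        Returns a boolean foreground mask fused across both modalities.
        """
        masks = {}
        for m, frame in frames.items():
            diff = frame - self.mean[m]
            std = np.sqrt(self.var[m])
            masks[m] = np.abs(diff) > self.k * std  # per-modality foreground test
            bg = ~masks[m]                          # adapt statistics only on background pixels
            self.mean[m][bg] += self.alpha * diff[bg]
            self.var[m][bg] += self.alpha * (diff[bg] ** 2 - self.var[m][bg])
        # fuse: a pixel is foreground if either the color or the thermal channel flags it
        return masks["color"] | masks["thermal"]


# Hypothetical usage with grayscale and thermal frames converted to float:
# model = FusedBackgroundModel(shape=(480, 640))
# fg = model.update({"color": gray.astype(float), "thermal": thermal.astype(float)})
```

In this simplified fusion rule, the thermal channel compensates for illumination changes that confuse the color channel, which is consistent with the robustness claim in the abstract; the chapter's particle-filter tracking and gait classification stages are not covered by this sketch.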

Citation (APA)

Ran, Y., Leykin, A., & Hammoud, R. (2009). Thermal-Visible Video Fusion for Moving Target Tracking and Pedestrian Motion Analysis and Classification. In Augmented Vision Perception in Infrared (pp. 349–369). Springer London. https://doi.org/10.1007/978-1-84800-277-7_15
