Context matters: Refining object detection in video with recurrent neural networks

Abstract

Given the vast amounts of video available online and recent breakthroughs in object detection with static images, object detection in video offers a promising new frontier. However, motion blur and compression artifacts cause substantial frame-level variability, even in videos that appear smooth to the eye. Additionally, in video datasets, frames are typically only sparsely annotated. We present a new framework for improving object detection in videos that captures temporal context and encourages consistency of predictions. First, we train a pseudo-labeler, i.e., a domain-adapted convolutional neural network for object detection, on the subset of labeled frames. We then apply it to provisionally label all frames, including those lacking labels. Finally, we train a recurrent neural network that takes as input sequences of pseudo-labeled frames and optimizes an objective that encourages both accuracy on the target frame and consistency across consecutive frames. The approach incorporates strong supervision of target frames, weak supervision on context frames, and regularization via a smoothness penalty. Our approach achieves a mean Average Precision (mAP) of 68.73, an improvement of 7.1 points over the strongest image-based baselines on the YouTube-Objects dataset. Our experiments demonstrate that neighboring frames can provide valuable information, even in the absence of labels.
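To make the three-part objective concrete, the following PyTorch-style sketch shows one way a recurrent refinement model and its loss could be set up: strong supervision on the annotated target frame, weak supervision from pseudo-labels on context frames, and a smoothness penalty tying consecutive predictions together. The module sizes, the use of a GRU and mean-squared error, and the weights alpha and beta are illustrative assumptions, not the authors' exact formulation.

# Minimal sketch (not the authors' code) of the abstract's combined objective.
# All dimensions and loss weights below are illustrative assumptions.
import torch
import torch.nn as nn

class RefinementRNN(nn.Module):
    """GRU that refines per-frame detector features into per-frame outputs."""
    def __init__(self, feat_dim=1024, hidden_dim=512, out_dim=1024):
        super().__init__()
        self.rnn = nn.GRU(feat_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, out_dim)

    def forward(self, x):            # x: (batch, time, feat_dim)
        h, _ = self.rnn(x)
        return self.head(h)          # (batch, time, out_dim)

def refinement_loss(pred, target_gt, pseudo, t_idx, alpha=0.5, beta=0.1):
    """pred: (B, T, D) refined outputs; target_gt: (B, D) ground truth for
    the annotated frame at index t_idx; pseudo: (B, T, D) pseudo-labels."""
    mse = nn.functional.mse_loss
    strong = mse(pred[:, t_idx], target_gt)   # accuracy on the target frame
    weak = mse(pred, pseudo)                  # stay near pseudo-labels
    smooth = mse(pred[:, 1:], pred[:, :-1])   # consistency across frames
    return strong + alpha * weak + beta * smooth

# Usage on random stand-in tensors: 2 clips of 8 frames each.
model = RefinementRNN()
feats = torch.randn(2, 8, 1024)
loss = refinement_loss(model(feats), torch.randn(2, 1024),
                       torch.randn(2, 8, 1024), t_idx=4)
loss.backward()

In this reading, raising alpha pulls predictions toward the pseudo-labels on unannotated frames, while beta trades per-frame accuracy for temporal consistency; the abstract frames both as regularizers around the strongly supervised target frame.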

Cite

APA

Tripathi, S., Lipton, Z. C., Belongie, S., & Nguyen, T. (2016). Context matters: Refining object detection in video with recurrent neural networks. In British Machine Vision Conference 2016, BMVC 2016 (Vol. 2016-September, pp. 1–12). British Machine Vision Conference, BMVC. https://doi.org/10.5244/C.30.44
