L-STAP: Learned spatio-temporal adaptive pooling for video captioning

Abstract

Automatic video captioning can be used to enrich TV programs with textual information about scenes. This information can be useful for visually impaired people, and it can also enhance the indexing and retrieval of TV records. Video captioning can be seen as more challenging than image captioning. In both cases, a visual input has to be analyzed and translated into a textual description in natural language. However, analyzing videos requires not only parsing still images, but also drawing correspondences through time. Recent works in video captioning have attempted to deal with these issues by separating the spatial and temporal analysis of videos. In this paper, we propose a Learned Spatio-Temporal Adaptive Pooling (L-STAP) method that combines spatial and temporal analysis. More specifically, we first process a video frame by frame through a Convolutional Neural Network. Then, instead of applying an average pooling operation to reduce dimensionality, we apply our L-STAP, which attends to specific regions in a given frame based on what appeared in previous frames. Experiments on the MSVD and MSR-VTT datasets show that our method outperforms state-of-the-art methods on the video captioning task in terms of several evaluation metrics.
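To make the pooling step concrete, below is a minimal sketch of how history-conditioned spatial attention could replace average pooling over frame-level CNN feature maps. It assumes PyTorch, ResNet-style 2048-channel feature maps, and an LSTM cell to carry the temporal state; all module and variable names (LSTAPSketch, proj_feat, etc.) are illustrative assumptions, not the authors' implementation, and the paper's exact attention and recurrence formulation may differ.

import torch
import torch.nn as nn
import torch.nn.functional as F

class LSTAPSketch(nn.Module):
    """Sketch of spatio-temporal adaptive pooling.

    For each frame's feature map (C x H x W), spatial attention weights
    are computed from the local features and a recurrent hidden state
    summarizing previous frames; the attended feature vector replaces a
    plain spatial average pooling.
    """
    def __init__(self, feat_dim=2048, hidden_dim=512):
        super().__init__()
        self.proj_feat = nn.Linear(feat_dim, hidden_dim)
        self.proj_hidden = nn.Linear(hidden_dim, hidden_dim)
        self.score = nn.Linear(hidden_dim, 1)
        self.rnn = nn.LSTMCell(feat_dim, hidden_dim)

    def forward(self, frames):
        # frames: (T, C, H, W) feature maps from a frame-level CNN
        T, C, H, W = frames.shape
        h = frames.new_zeros(1, self.rnn.hidden_size)
        c = torch.zeros_like(h)
        pooled = []
        for t in range(T):
            feat = frames[t].view(C, H * W).t()              # (H*W, C)
            # attention over spatial locations, conditioned on history
            e = self.score(torch.tanh(
                self.proj_feat(feat) + self.proj_hidden(h)))  # (H*W, 1)
            alpha = F.softmax(e, dim=0)
            v = (alpha * feat).sum(dim=0, keepdim=True)       # (1, C)
            h, c = self.rnn(v, (h, c))                        # update temporal state
            pooled.append(v.squeeze(0))
        return torch.stack(pooled)                            # (T, C)

# Usage sketch: 16 frames of 7x7 ResNet-style feature maps
net = LSTAPSketch()
maps = torch.randn(16, 2048, 7, 7)
feats = net(maps)   # (16, 2048) sequence of adaptively pooled frame features

The pooled sequence would then feed a caption decoder in place of the average-pooled features.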

Citation (APA)

Francis, D., & Huet, B. (2019). L-STAP: Learned spatio-temporal adaptive pooling for video captioning. In AI4TV 2019 - Proceedings of the 1st International Workshop on AI for Smart TV Content Production, Access and Delivery, co-located with MM 2019 (pp. 33–41). Association for Computing Machinery, Inc. https://doi.org/10.1145/3347449.3357484
