Sensor-Augmented Egocentric-Video Captioning with Dynamic Modal Attention


Abstract

Automatically describing video, or video captioning, has been widely studied in the multimedia field. This paper proposes a new task, sensor-augmented egocentric-video captioning; a dataset constructed for it, called MMAC Captions; and a method for the task that effectively utilizes multi-modal data from video and motion sensors, i.e., inertial measurement units (IMUs). While conventional video-captioning tasks have difficulty producing detailed descriptions of human activities due to the limited view of a fixed camera, egocentric vision has greater potential for generating finer-grained descriptions of human activities on the basis of a much closer view. In addition, we utilize wearable-sensor data as auxiliary information to mitigate the inherent problems of egocentric vision: motion blur, self-occlusion, and out-of-camera-range activities. We propose a method for effectively combining the sensor data with the video data on the basis of an attention mechanism that dynamically determines which modality requires more attention, taking contextual information into account. Experiments on the MMAC Captions dataset show that using sensor data as supplementary information to the egocentric-video data is beneficial and that the proposed sensor-fusion method outperforms strong baselines.
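To make the fusion idea concrete, the following is a minimal sketch of a dynamic modal attention module in PyTorch. It is an illustration of the mechanism as described in the abstract, not the authors' implementation: all dimensions, layer choices, and names (e.g., DynamicModalAttention, the use of a decoder hidden state as context) are assumptions. The module scores each modality against the current decoding context and takes a softmax-weighted sum of the projected video and IMU features, so the weighting can shift from step to step.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DynamicModalAttention(nn.Module):
    """Fuse video and IMU features with context-dependent modality weights.

    A sketch of the idea only; the paper's actual architecture may differ.
    """

    def __init__(self, video_dim: int, imu_dim: int, ctx_dim: int, fused_dim: int):
        super().__init__()
        # Project each modality into a shared feature space.
        self.video_proj = nn.Linear(video_dim, fused_dim)
        self.imu_proj = nn.Linear(imu_dim, fused_dim)
        # Score each modality against the decoding context
        # (e.g., the caption decoder's hidden state).
        self.score = nn.Linear(ctx_dim + fused_dim, 1)

    def forward(self, video_feat, imu_feat, context):
        # video_feat: (B, video_dim), imu_feat: (B, imu_dim), context: (B, ctx_dim)
        v = torch.tanh(self.video_proj(video_feat))    # (B, fused_dim)
        s = torch.tanh(self.imu_proj(imu_feat))        # (B, fused_dim)
        modalities = torch.stack([v, s], dim=1)        # (B, 2, fused_dim)
        ctx = context.unsqueeze(1).expand(-1, 2, -1)   # (B, 2, ctx_dim)
        # One scalar score per modality, normalized across modalities.
        scores = self.score(torch.cat([ctx, modalities], dim=-1)).squeeze(-1)  # (B, 2)
        weights = F.softmax(scores, dim=1)             # dynamic per-step weights
        fused = (weights.unsqueeze(-1) * modalities).sum(dim=1)  # (B, fused_dim)
        return fused, weights
```

Because the weights are recomputed from the context at every decoding step, the model can lean on the IMU stream when the video is uninformative (motion blur, self-occlusion, out-of-range activity) and on the video otherwise, which is the behavior the abstract attributes to the proposed mechanism.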

Citation (APA)

Nakamura, K., Ohashi, H., & Okada, M. (2021). Sensor-Augmented Egocentric-Video Captioning with Dynamic Modal Attention. In MM 2021 - Proceedings of the 29th ACM International Conference on Multimedia (pp. 4220–4229). Association for Computing Machinery, Inc. https://doi.org/10.1145/3474085.3475557
