Few-shot Egocentric Multimodal Activity Recognition


Abstract

Activity recognition based on egocentric multimodal data collected by wearable devices has become increasingly popular. However, conventional activity recognition methods face a shortage of large-scale labeled egocentric multimodal datasets because data collection is costly. In this paper, we propose the new task of few-shot egocentric multimodal activity recognition, which poses at least two significant challenges. On the one hand, the scarcity of samples makes it difficult to extract effective features from multimodal sequences of video and sensor signals. On the other hand, the complexity of the multimodal data makes it even more challenging to robustly recognize novel activity classes from very few labeled samples. To address these challenges, we propose a two-stream graph network, which consists of a heterogeneous graph-based multimodal association module and a knowledge-aware activity classifier module. The former uses a heterogeneous graph network to comprehensively capture the dynamic and complementary information contained in the multimodal data stream. The latter learns robust activity classifiers by propagating knowledge among the classifier parameters of different classes. In addition, we adopt an episodic training strategy to improve the generalization ability of the proposed few-shot activity recognition model. Experiments on two public datasets show that the proposed model achieves better performance than the baseline models.
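The episodic training strategy mentioned in the abstract is typically implemented by repeatedly sampling small N-way K-shot classification tasks (episodes) from the base classes, each with a support set and a query set. A minimal sketch of such an episode sampler is shown below; the class-to-samples dictionary structure and the function name are illustrative assumptions, not the paper's actual data pipeline (where each sample would be a paired video/sensor sequence).

```python
import random

def sample_episode(dataset, n_way=5, k_shot=1, q_queries=5):
    """Sample one N-way K-shot episode.

    dataset: dict mapping each activity class name to a list of its
             samples (hypothetical structure for illustration).
    Returns (support, query), each a list of (sample, episode_label)
    pairs, where episode_label is 0..n_way-1 within this episode.
    """
    # Pick N classes at random for this episode.
    classes = random.sample(sorted(dataset), n_way)
    support, query = [], []
    for label, cls in enumerate(classes):
        # Draw K support samples plus Q query samples without overlap.
        picks = random.sample(dataset[cls], k_shot + q_queries)
        support += [(x, label) for x in picks[:k_shot]]
        query += [(x, label) for x in picks[k_shot:]]
    return support, query
```

During meta-training, the model is optimized over many such episodes so that it learns to generalize to a new episode built from novel classes at test time.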

APA

Pan, J., Yang, X., Huang, Y., & Xu, C. (2021). Few-shot Egocentric Multimodal Activity Recognition. In ACM International Conference Proceeding Series. Association for Computing Machinery. https://doi.org/10.1145/3469877.3490603
