Human activity recognition based on transfer learning with spatio-temporal representations


Abstract

A Gait History Image (GHI) is a spatial template that accumulates regions of motion into a single image in which moving pixels are brighter than others. A new descriptor named Time-Sliced Averaged Gradient Boundary Magnitude (TAGBM) is also designed to capture the time variations of motion. The spatial and temporal information of each video can be condensed using these templates. Based on this idea, a new method is proposed in this paper. Each video is split into N and M groups of consecutive frames, and the GHI and TAGBM are computed for each group, resulting in spatial and temporal templates. Transfer learning with the fine-tuning technique is used for classifying these templates. The proposed method achieves recognition accuracies of 96.50%, 92.30% and 97.12% on the KTH, UCF Sport and UCF-11 action datasets, respectively. It is also compared with state-of-the-art approaches, and the results show that the proposed method has the best performance.
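The GHI described above follows the motion-history idea: frame-to-frame motion is accumulated into a single image, with more recent motion rendered brighter. A minimal sketch of such a template, assuming grayscale frames and a hypothetical difference `threshold` parameter (the paper's exact formulation may differ):

```python
import numpy as np

def gait_history_image(frames, threshold=30):
    """Sketch of a GHI-style spatial template.

    Accumulates inter-frame motion into one image; pixels that moved
    more recently are assigned a higher (brighter) intensity.
    `frames` is a list of same-sized grayscale arrays; `threshold`
    is an assumed motion-detection parameter, not from the paper.
    """
    frames = [f.astype(np.float32) for f in frames]
    n = len(frames)
    ghi = np.zeros_like(frames[0])
    for t in range(1, n):
        # Pixels whose intensity changed more than the threshold
        # between consecutive frames are treated as moving.
        motion = np.abs(frames[t] - frames[t - 1]) > threshold
        # More recent motion overwrites older motion with a
        # brighter value, scaled to the 0-255 range.
        ghi[motion] = 255.0 * t / (n - 1)
    return ghi.astype(np.uint8)
```

In the proposed pipeline, one such template would be computed per group of consecutive frames, and the resulting images fed to a fine-tuned pretrained network for classification.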

Citation (APA)

Zebhi, S., Almodarresi, S. M. T., & Abootalebi, V. (2021). Human activity recognition based on transfer learning with spatio-temporal representations. International Arab Journal of Information Technology, 18(6), 839–845. https://doi.org/10.34028/iajit/18/6/11
