Real-Time Recognition of Daily Actions Based on 3D Joint Movements and Fisher Encoding


Abstract

Recognition of daily actions is an essential part of Ambient Assisted Living (AAL) applications and remains not fully solved. In this work, we propose a novel framework for the recognition of actions of daily living from depth videos. The framework is based on low-level human pose movement descriptors extracted from 3D joint trajectories, together with differential values that encode speed and acceleration information. The joints are detected using a depth sensor. The low-level descriptors are then aggregated into discriminative high-level action representations by modeling prototype pose movements with Gaussian Mixtures and applying a Fisher encoding scheme. The resulting Fisher vectors are suitable for training linear SVM classifiers to recognize actions in pre-segmented video clips, alleviating the need for additional parameter search with non-linear kernels or neural network tuning. Experimental evaluation on two well-known RGB-D action datasets reveals that the proposed framework achieves close to state-of-the-art performance while maintaining high processing speeds.
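The encoding step described above can be sketched as follows. This is a minimal illustration of standard Fisher vector encoding with a diagonal-covariance GMM (gradients with respect to means and variances, followed by the usual power and L2 normalization), not the authors' implementation; function names and normalization choices are assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fisher_vector(descriptors, gmm):
    """Aggregate a clip's low-level descriptors (N x D) into one
    Fisher vector of size 2*K*D, given a fitted diagonal-covariance
    GMM with K components modeling prototype pose movements."""
    X = np.atleast_2d(descriptors)
    N = X.shape[0]
    gamma = gmm.predict_proba(X)            # (N, K) soft assignments
    mu = gmm.means_                         # (K, D)
    sigma = np.sqrt(gmm.covariances_)       # (K, D), diagonal covariances
    w = gmm.weights_                        # (K,)

    diff = (X[:, None, :] - mu[None]) / sigma[None]   # (N, K, D) whitened offsets
    # Gradients w.r.t. means and variances, normalized by component weight.
    g_mu = (gamma[..., None] * diff).sum(0) / (N * np.sqrt(w)[:, None])
    g_sigma = (gamma[..., None] * (diff ** 2 - 1)).sum(0) / (N * np.sqrt(2 * w)[:, None])

    fv = np.hstack([g_mu.ravel(), g_sigma.ravel()])
    fv = np.sign(fv) * np.sqrt(np.abs(fv))            # power normalization
    return fv / (np.linalg.norm(fv) + 1e-12)          # L2 normalization
```

A vector produced this way per pre-segmented clip can then be fed directly to a linear SVM (e.g. `sklearn.svm.LinearSVC`), which is what makes the non-linear kernel search unnecessary.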

Citation (APA)

Giannakeris, P., Meditskos, G., Avgerinakis, K., Vrochidis, S., & Kompatsiaris, I. (2020). Real-Time Recognition of Daily Actions Based on 3D Joint Movements and Fisher Encoding. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11962 LNCS, pp. 601–613). Springer. https://doi.org/10.1007/978-3-030-37734-2_49
