Activity recognition in egocentric life-logging videos

Abstract

With the increasing availability of wearable cameras, research on first-person-view (egocentric) videos has received much attention recently. While some effort has been devoted to collecting various egocentric video datasets, there has been no focused effort to assemble one that captures the diversity and complexity of activities related to life-logging, which is expected to be an important application for egocentric videos. In this work, we first conduct a comprehensive survey of existing egocentric video datasets and observe that they do not emphasize activities relevant to the life-logging scenario. We then build an egocentric video dataset dubbed LENA (Life-logging EgoceNtric Activities) (http://people.sutd.edu.sg/~1000892/dataset), which comprises egocentric videos of 13 fine-grained activity categories recorded under diverse situations and environments using Google Glass. Activities in LENA can also be grouped into 5 top-level categories to meet the varied needs of activity-analysis research. We evaluate state-of-the-art activity recognition on LENA in detail and analyze the performance of popular descriptors for egocentric activity recognition.

Cite

APA

Song, S., Chandrasekhar, V., Cheung, N. M., Narayan, S., Li, L., & Lim, J. H. (2015). Activity recognition in egocentric life-logging videos. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9010, pp. 445–458). Springer Verlag. https://doi.org/10.1007/978-3-319-16634-6_33
