Recognition of instrumental activities of daily living in egocentric video for activity monitoring of patients with dementia

Abstract

In this chapter we study the problem of recognizing Instrumental Activities of Daily Living (IADL) from an egocentric camera view. The target application of this research is the indexing of videos of patients with Alzheimer's disease, providing medical staff with fast access to and easy navigation through the video content, and helping them assess patients' abilities to perform IADL. Driven by the observation that an activity in egocentric video can be defined as a sequence of objects interacted with inside different rooms, we present a novel representation based on the output of object and room detectors over temporal segments. In addition, our object detection approach is extended with automatic detection of visually salient regions, since distinguishing active objects from context has been shown to dramatically improve performance in egocentric ADL recognition. We have assessed our proposal on a publicly available egocentric dataset and show extensive experimental results demonstrating that our approach outperforms the current state of the art in unconstrained scenarios where training and testing environments may differ notably.
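The abstract describes representing an activity as detector outputs pooled over temporal segments, but does not give the exact formulation. The sketch below is one plausible reading, assuming per-frame object- and room-detector confidence scores that are max-pooled within each segment and concatenated into a fixed-length descriptor; the function name, pooling choice, and segment count are illustrative assumptions, not the chapter's method.

```python
import numpy as np

def segment_representation(obj_scores, room_scores, n_segments=4):
    """Build a fixed-length activity descriptor from per-frame detector scores.

    obj_scores:  (T, n_objects) object-detector confidences per frame
    room_scores: (T, n_rooms)   room-detector confidences per frame

    Frames are split into n_segments contiguous temporal segments; scores
    are max-pooled within each segment and concatenated.
    """
    # Stack object and room scores side by side: (T, n_objects + n_rooms)
    scores = np.hstack([obj_scores, room_scores])
    # Split the timeline into contiguous temporal segments
    segments = np.array_split(scores, n_segments, axis=0)
    # Max-pool each detector channel within every segment
    pooled = [seg.max(axis=0) for seg in segments]
    # Final descriptor length: n_segments * (n_objects + n_rooms)
    return np.concatenate(pooled)

# Toy example: 100 frames, 5 object classes, 3 room classes
rng = np.random.default_rng(0)
desc = segment_representation(rng.random((100, 5)), rng.random((100, 3)))
print(desc.shape)  # (32,)
```

Such a descriptor could then be fed to any off-the-shelf classifier (e.g. an SVM) to predict the IADL label of a video clip.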

Citation (APA)

González-Díaz, I., Buso, V., Benois-Pineau, J., Bourmaud, G., Usseglio, G., Mégret, R., … Dartigues, J. F. (2015). Recognition of instrumental activities of daily living in egocentric video for activity monitoring of patients with dementia. In Health Monitoring and Personalized Feedback using Multimedia Data (pp. 161–178). Springer International Publishing. https://doi.org/10.1007/978-3-319-17963-6_9
