Human Activity Recognition for Assisted Living Based on Scene Understanding

Abstract

The growing share of the population over the age of 65 is putting pressure on the social health insurance system, especially on institutions that provide long-term care services for the elderly or for people who suffer from chronic diseases or mental disabilities. This pressure can be reduced through assisted living of patients, based on an intelligent system for monitoring vital signs and home automation. In this regard, since 2008, the European Commission has financed the development of medical products and services through the ambient assisted living (AAL) program, Ageing Well in the Digital World. The SmartCare Project, which integrates the proposed computer vision solution, follows the European strategy on AAL. This paper presents an indoor human activity recognition (HAR) system based on scene understanding. The system consists of a ZED 2 stereo camera and an NVIDIA Jetson AGX processing unit. Human activity recognition is carried out in two stages: first, all humans and objects in the frame are detected by a neural network; the results are then fed to a second network that detects interactions between humans and objects. The activity score is determined from the human–object interaction (HOI) detections.
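
The abstract describes a two-stage recognition pipeline: an object detector finds humans and objects, a second network pairs them into human–object interactions, and those detections are aggregated into an activity score. The sketch below illustrates that data flow in Python; the Detection and Interaction types, the stubbed detector functions, and the mean-confidence scoring rule are assumptions made for illustration, not the authors' implementation.

    # Illustrative sketch of the two-stage HAR pipeline described in the abstract.
    # The data classes, stubbed detectors, and scoring rule are assumptions made
    # for clarity; they are not the networks or formula used in the paper.

    from dataclasses import dataclass
    from typing import List, Tuple


    @dataclass
    class Detection:
        label: str                      # e.g. "person", "cup"
        box: Tuple[int, int, int, int]  # (x1, y1, x2, y2) in pixels
        confidence: float


    @dataclass
    class Interaction:
        human: Detection
        obj: Detection
        verb: str                       # e.g. "holding", "sitting_on"
        confidence: float


    def detect_objects(frame) -> List[Detection]:
        """Stage 1: an object detection network finds all humans and objects
        in the frame. Real inference would run here; stubbed for illustration."""
        return []


    def detect_interactions(frame, detections: List[Detection]) -> List[Interaction]:
        """Stage 2: the stage-1 detections are fed to a second network that
        pairs each human with objects and predicts an interaction verb."""
        return []


    def activity_score(interactions: List[Interaction]) -> float:
        """Aggregate HOI confidences into a single activity score
        (a simple mean here; the paper's exact scoring rule may differ)."""
        if not interactions:
            return 0.0
        return sum(i.confidence for i in interactions) / len(interactions)


    def process_frame(frame) -> float:
        """Full per-frame pipeline: detect, pair, score."""
        detections = detect_objects(frame)
        interactions = detect_interactions(frame, detections)
        return activity_score(interactions)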

Citation (APA)

Achirei, S. D., Heghea, M. C., Lupu, R. G., & Manta, V. I. (2022). Human Activity Recognition for Assisted Living Based on Scene Understanding. Applied Sciences (Switzerland), 12(21). https://doi.org/10.3390/app122110743
