Recognizing activities of daily living is useful for ambient assisted living, and wearable cameras are a promising technology for this task. In this paper, we propose a novel approach for recognizing activities of daily living from egocentric-viewpoint video clips. First, the objects appearing in every frame are detected and labelled according to whether or not they are being used by the subject. Next, the video clip is divided into spatio-temporal bins created with an object-centric cut. Finally, a support vector machine classifier is trained using a spatio-temporal flexible kernel between video clips. The validity of the proposed method has been demonstrated through experiments on the ADL dataset. The results confirm the suitability of using the space-time location of objects as information for classifying activities from an egocentric viewpoint.
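The pipeline described in the abstract (per-frame object detections tagged as used/unused, binned over space and time, then compared with a kernel between clips) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the grid sizes, the detection tuple format, and the use of a histogram-intersection kernel as a stand-in for the paper's flexible kernel are all assumptions made for the example.

```python
import numpy as np

# Illustrative sizes (assumptions, not values from the paper)
N_LABELS = 4        # number of object classes
GRID = (2, 2, 2)    # (time, x, y) spatio-temporal bins

def clip_histogram(detections, n_frames, frame_w, frame_h):
    """Build a normalized spatio-temporal histogram for one clip.

    detections: list of (frame, x, y, label, active), where
    `active` flags whether the object is being used by the subject.
    """
    tb, xb, yb = GRID
    # one slot per (time bin, x bin, y bin, object label, active flag)
    hist = np.zeros((tb, xb, yb, N_LABELS, 2))
    for f, x, y, lab, act in detections:
        ti = min(int(f / n_frames * tb), tb - 1)
        xi = min(int(x / frame_w * xb), xb - 1)
        yi = min(int(y / frame_h * yb), yb - 1)
        hist[ti, xi, yi, lab, int(act)] += 1
    v = hist.ravel()
    s = v.sum()
    return v / s if s > 0 else v

def intersection_kernel(feats):
    """Gram matrix of histogram intersection between clip features,
    usable with an SVM that accepts a precomputed kernel."""
    n = len(feats)
    K = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            K[i, j] = np.minimum(feats[i], feats[j]).sum()
    return K
```

The resulting Gram matrix could then be passed to an SVM that supports precomputed kernels (e.g. `sklearn.svm.SVC(kernel="precomputed")`).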
CITATION STYLE
Rodriguez, M., Orrite, C., & Medrano, C. (2017). Space-time flexible kernel for recognizing activities from wearable cameras. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 10255 LNCS, pp. 511–518). Springer Verlag. https://doi.org/10.1007/978-3-319-58838-4_56