CBR tagging of emotions from facial expressions


Abstract

Mobility and context-awareness are two active research directions that open up new possibilities for recommender systems. Dynamically enriching a recommendation process with information from the user's context helps the system find solutions better adapted to the specific situation. In this paper we focus on the difficult problem of dynamically acquiring the user's emotional context during a recommendation process. We exploit the fact that emotions are tightly connected with facial expressions, and that facial expressions are hard for people to suppress. We describe PhotoMood, a CBR system that uses facial gestures to identify emotions, and present preliminary experiments with MadridLive, a mobile, context-aware recommender system for leisure activities in Madrid. In the experiments, the user's momentary emotion is detected dynamically from pictures of their facial expression taken unobtrusively with the front-facing camera of the mobile device.
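The core CBR retrieval step the abstract alludes to can be illustrated with a minimal sketch. This is not the authors' implementation: the feature vectors, case base, and similarity measure below are hypothetical stand-ins for whatever facial-gesture features PhotoMood actually extracts; the sketch only shows the generic retrieve step of a case-based reasoner, here as 1-nearest-neighbour lookup over stored (features, emotion) cases.

```python
import math

# Hypothetical case base: each case pairs a facial-feature vector
# (e.g. normalized geometric measurements of mouth and eyebrow
# landmarks) with an emotion tag. All values are illustrative.
CASE_BASE = [
    ([0.9, 0.1, 0.8], "happy"),
    ([0.2, 0.8, 0.1], "sad"),
    ([0.5, 0.5, 0.5], "neutral"),
]

def similarity(a, b):
    """Inverse-distance similarity between two feature vectors."""
    dist = math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return 1.0 / (1.0 + dist)

def retrieve_emotion(query):
    """CBR retrieve step: return the emotion tag of the most
    similar stored case (1-nearest-neighbour)."""
    best_case = max(CASE_BASE, key=lambda case: similarity(query, case[0]))
    return best_case[1]

# A query vector close to the stored "happy" case retrieves that tag.
print(retrieve_emotion([0.85, 0.15, 0.75]))
```

In a full CBR cycle, retrieval would be followed by reuse, revision, and retention of new cases; the paper's contribution concerns applying this cycle to emotion tagging, not this particular similarity function.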

Citation (APA)

López-de-Arenosa, P., Díaz-Agudo, B., & Recio-García, J. A. (2014). CBR tagging of emotions from facial expressions. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 8765, 245–259. https://doi.org/10.1007/978-3-319-11209-1_18
