This paper describes the conceptual model and implementation of an emotion-aware system able to manage multimedia content (i.e., music tracks) and lighting scenarios based on the user's emotion, detected from facial expressions. The system captures emotions from the user's facial expressions, maps them into a 2D valence-arousal space where the multimedia content is also positioned, and matches them with a lighting color. A preliminary experiment involving a total of 26 subjects was carried out to assess the system's emotion-recognition effectiveness and its ability to manage the environment appropriately. Results highlight several limitations of emotion recognition through facial-expression detection and open up several research challenges.
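The pipeline the abstract describes (detected emotion → valence-arousal point → lighting color) can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: the `EMOTION_VA` coordinates and the angle-to-hue rule are assumed values chosen for the example.

```python
import math

# Hypothetical valence-arousal coordinates for some basic emotions.
# These values are illustrative; the paper does not publish its mapping.
EMOTION_VA = {
    "happy":    (0.8, 0.5),
    "sad":      (-0.7, -0.4),
    "angry":    (-0.6, 0.7),
    "calm":     (0.4, -0.6),
    "surprise": (0.2, 0.8),
}

def va_to_hue(valence: float, arousal: float) -> float:
    """Map a valence-arousal point to a hue angle in degrees [0, 360).

    The angle of the (valence, arousal) vector is used directly as the
    hue, so nearby emotions in the 2D space get similar lighting colors.
    This is one possible scheme, assumed for illustration.
    """
    angle = math.degrees(math.atan2(arousal, valence))
    return angle % 360.0

def lighting_hue_for(emotion: str) -> float:
    """Return a lighting hue for a detected emotion label."""
    valence, arousal = EMOTION_VA[emotion]
    return va_to_hue(valence, arousal)
```

A music track annotated with its own valence-arousal point could be matched to the user's emotion with a simple Euclidean nearest-neighbor search in the same 2D space.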
Citation:
Altieri, A., Ceccacci, S., & Mengoni, M. (2019). Emotion-aware ambient intelligence: Changing smart environment interaction paradigms through affective computing. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11587 LNCS, pp. 258–270). Springer Verlag. https://doi.org/10.1007/978-3-030-21935-2_20