Natural interaction multimodal analysis: Expressivity analysis towards adaptive and personalized interfaces

Abstract

Intelligent personalized systems often ignore the affective aspects of human behavior and focus more on tactile cues of user activity. Complete user modelling, though, should also incorporate cues such as facial expressions, speech prosody, and gesture or body posture expressivity features in order to profile the user dynamically, fusing all available modalities, since these qualitative affective cues carry significant information about the user's non-verbal behavior and communication. Towards this direction, this work focuses on the automatic extraction of gestural and head expressivity features and their statistical processing. The perspective of adopting a common formalization of expressivity features across a multitude of visual, emotional modalities is explored and grounded through an overview of experiments on appropriate corpora and the corresponding analysis. © 2012 IEEE.
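To make the notion of expressivity features concrete, the sketch below shows how such descriptors might be computed from a tracked hand trajectory. It is a minimal illustration under stated assumptions, not the authors' exact formulation: the function name expressivity_features is hypothetical, the input is assumed to be 2D hand positions sampled at a fixed frame rate, and the definitions (overall activation as summed motion, spatial extent as the swept bounding box, fluidity as inverse speed variability, power as peak acceleration) follow the general spirit of expressivity parameters in the gesture literature.

import numpy as np

def expressivity_features(positions, fps=25.0):
    """Illustrative sketch: simple expressivity descriptors from a tracked
    hand trajectory. `positions` is an (N, 2) array of coordinates sampled
    at `fps` frames per second. Definitions are assumptions for exposition,
    not the paper's exact measures."""
    pos = np.asarray(positions, dtype=float)
    dt = 1.0 / fps
    vel = np.diff(pos, axis=0) / dt          # frame-to-frame velocity
    acc = np.diff(vel, axis=0) / dt          # frame-to-frame acceleration
    speed = np.linalg.norm(vel, axis=1)

    return {
        # Overall activation: total amount of motion over the gesture.
        "overall_activation": float(np.sum(speed) * dt),
        # Spatial extent: diagonal of the bounding box swept by the hand.
        "spatial_extent": float(np.linalg.norm(pos.max(axis=0) - pos.min(axis=0))),
        # Temporal: mean speed of the stroke.
        "temporal": float(speed.mean()),
        # Fluidity: smoother motion (lower speed variability) scores higher.
        "fluidity": float(1.0 / (1.0 + speed.std())),
        # Power: peak acceleration magnitude.
        "power": float(np.linalg.norm(acc, axis=1).max()),
    }

# Example: a synthetic, gently curving diagonal gesture.
t = np.linspace(0.0, 1.0, 26)
traj = np.stack([100 + 200 * t, 150 + 80 * np.sin(np.pi * t)], axis=1)
print(expressivity_features(traj))

Per-gesture descriptors like these can then be aggregated statistically (e.g., means and distributions over a corpus) and compared across modalities such as hand gestures and head movement, which is the kind of common formalization the abstract refers to.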

Citation (APA)

Asteriadis, S., Caridakis, G., Malatesta, L., & Karpouzis, K. (2012). Natural interaction multimodal analysis: Expressivity analysis towards adaptive and personalized interfaces. In Proceedings - 2012 7th International Workshop on Semantic and Social Media Adaptation and Personalization, SMAP 2012 (pp. 131–136). https://doi.org/10.1109/SMAP.2012.11
