Towards formal multimodal analysis of emotions for affective computing

14 citations · 20 Mendeley readers

Abstract

Social robotics concerns robotic systems that interact with humans. Social robots have applications in elderly care, health care, home care, customer service, and reception in industrial settings. Human-Robot Interaction (HRI) requires a better understanding of human emotion, yet existing multimodal fusion systems are few and integrate only a limited set of facial expression, speech, and gesture cues. In this paper, we describe the implementation of a formal model, based on semantic algebra, that integrates the six basic facial expressions, speech phrases, and gesture trajectories. The system is capable of real-time interaction. We used a decision-level fusion approach for integration, and the prototype has been implemented in Matlab.
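The decision-level fusion mentioned in the abstract can be illustrated with a minimal sketch: each modality's classifier produces a probability distribution over the six basic emotions, and the fused decision is the arg-max of a confidence-weighted average. The function name, modality weights, and probability values below are illustrative assumptions, not the paper's actual algorithm.

```python
# Hypothetical decision-level fusion sketch: per-modality emotion
# distributions are combined by a confidence-weighted average.

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

def fuse_decisions(modality_probs, weights):
    """Fuse per-modality emotion distributions (decision-level fusion).

    modality_probs: dict mapping modality name -> list of 6 probabilities
    weights:        dict mapping modality name -> confidence weight
    Returns the fused emotion label and the fused distribution.
    """
    total_w = sum(weights[m] for m in modality_probs)
    fused = [
        sum(weights[m] * modality_probs[m][i] for m in modality_probs) / total_w
        for i in range(len(EMOTIONS))
    ]
    # Pick the emotion with the highest fused probability.
    label = EMOTIONS[max(range(len(fused)), key=fused.__getitem__)]
    return label, fused

# Illustrative inputs: each row is a distribution over the six emotions.
probs = {
    "face":    [0.10, 0.05, 0.05, 0.60, 0.10, 0.10],
    "speech":  [0.20, 0.05, 0.05, 0.50, 0.10, 0.10],
    "gesture": [0.15, 0.10, 0.10, 0.40, 0.15, 0.10],
}
weights = {"face": 0.5, "speech": 0.3, "gesture": 0.2}
label, fused = fuse_decisions(probs, weights)
print(label)  # "happiness"
```

Weighting the face modality most heavily reflects a common assumption in such systems that facial expression is the strongest single emotion cue; the weights here are arbitrary placeholders.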

Citation (APA)

Ghayoumi, M., Thafa, M., & Bansal, A. K. (2016). Towards formal multimodal analysis of emotions for affective computing. In Proceedings - DMS 2016: 22nd International Conference on Distributed Multimedia Systems (pp. 48–54). Knowledge Systems Institute Graduate School. https://doi.org/10.18293/DMS2016-030
