Social robotics concerns robotic systems that interact with humans. Social robots have applications in elderly care, health care, home care, customer service, and reception in industrial settings. Human-Robot Interaction (HRI) requires a better understanding of human emotion, yet few multimodal fusion systems integrate facial expression, speech, and gesture analysis, and those that exist do so only to a limited extent. In this paper, we describe the implementation of a formal model based on semantic algebra that integrates the six basic facial expressions, speech phrases, and gesture trajectories. The system is capable of real-time interaction. We used a decision-level fusion approach for integration, and the prototype system has been implemented using MATLAB.
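The decision-level fusion mentioned above can be illustrated with a minimal sketch: each modality classifier (face, speech, gesture) is assumed to output a probability distribution over the six basic emotions, and the fused decision is a weighted combination of those distributions. The weights, emotion labels, and function name below are illustrative assumptions, not details from the paper.

```python
# Hypothetical decision-level fusion sketch (not the paper's implementation).
# Each modality contributes a probability distribution over six basic emotions;
# the fused distribution is their weighted sum, renormalized, and the final
# decision is the emotion with the highest fused probability.

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

def fuse_decisions(face, speech, gesture, weights=(0.5, 0.3, 0.2)):
    """Weighted-sum fusion of per-modality emotion probabilities.

    face, speech, gesture: length-6 probability lists (assumed classifier
    outputs); weights: illustrative per-modality confidence weights.
    """
    fused = [
        weights[0] * f + weights[1] * s + weights[2] * g
        for f, s, g in zip(face, speech, gesture)
    ]
    total = sum(fused)                      # renormalize to a distribution
    fused = [p / total for p in fused]
    best = max(range(len(fused)), key=fused.__getitem__)
    return EMOTIONS[best], fused
```

In practice the per-modality weights would be tuned on validation data; a uniform weighting is the simplest starting point.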
CITATION STYLE
Ghayoumi, M., Thafa, M., & Bansal, A. K. (2016). Towards formal multimodal analysis of emotions for affective computing. In Proceedings - DMS 2016: 22nd International Conference on Distributed Multimedia Systems (pp. 48–54). Knowledge Systems Institute Graduate School. https://doi.org/10.18293/DMS2016-030