Object oriented design for multiple modalities in affective interaction

Abstract

The purpose of this chapter is to investigate how an object oriented (OO) architecture can be adapted to multimodal emotion recognition applications with mobile interfaces. A major obstacle is that mobile phones, unlike desktop computers, lack the processing power that emotion recognition demands. To overcome this limitation, in our approach mobile phones transmit all collected data to a server, which is responsible for, among other tasks, performing the emotion recognition. The object oriented architecture we have created combines evidence from multiple modalities of interaction, namely the mobile device's keyboard and microphone, as well as data from user stereotypes. All collected information is classified into well-structured objects with their own properties and methods. The resulting emotion detection platform can process and re-transmit information from different mobile sources of multimodal data during human-computer interaction. The interface used as a test bed for the affective mobile interaction is that of an educational m-learning application. © Springer-Verlag Berlin Heidelberg 2014.
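To make the OO decomposition in the abstract concrete, the following Java sketch is a minimal, hypothetical rendering of the idea, not the authors' actual platform: evidence from each modality (keyboard, microphone) is modeled as its own class with properties and methods, a user stereotype supplies a prior, and a server-side combiner fuses the evidence objects that the phone uploads. All class names, fields, and scoring heuristics here are illustrative assumptions.

```java
import java.util.Arrays;
import java.util.List;

public class MultimodalSketch {

    // One object per modality of interaction; each carries its own
    // properties and methods, as described in the chapter's architecture.
    abstract static class ModalityEvidence {
        // How strongly this modality supports a candidate emotion, in [0, 1].
        abstract double score(String emotion);
    }

    // Evidence collected from the mobile device's keyboard.
    static class KeyboardEvidence extends ModalityEvidence {
        final double backspaceRate; // corrections per typed character (toy feature)
        KeyboardEvidence(double backspaceRate) { this.backspaceRate = backspaceRate; }
        @Override double score(String emotion) {
            // Toy heuristic: frequent corrections weakly suggest frustration.
            return emotion.equals("frustration") ? Math.min(1.0, 2 * backspaceRate) : 0.1;
        }
    }

    // Evidence collected from the mobile device's microphone.
    static class MicrophoneEvidence extends ModalityEvidence {
        final double meanVolumeDb; // average speaking volume (toy feature)
        MicrophoneEvidence(double meanVolumeDb) { this.meanVolumeDb = meanVolumeDb; }
        @Override double score(String emotion) {
            // Toy heuristic: a raised voice weakly suggests frustration.
            return emotion.equals("frustration") && meanVolumeDb > 70 ? 0.8 : 0.2;
        }
    }

    // Stereotype data about the user, supplying a prior over emotions.
    static class UserStereotype {
        double prior(String emotion) { return 0.5; } // neutral prior in this sketch
    }

    // Server-side fusion: the phone only collects and transmits evidence
    // objects; the server, which has the processing power the phone lacks,
    // combines them into a single recognized emotion.
    static class EmotionRecognitionServer {
        String classify(List<ModalityEvidence> evidence, UserStereotype user,
                        List<String> candidates) {
            String best = null;
            double bestScore = Double.NEGATIVE_INFINITY;
            for (String emotion : candidates) {
                double s = user.prior(emotion);
                for (ModalityEvidence e : evidence) s += e.score(emotion);
                if (s > bestScore) { bestScore = s; best = emotion; }
            }
            return best;
        }
    }

    public static void main(String[] args) {
        List<ModalityEvidence> evidence = Arrays.asList(
                new KeyboardEvidence(0.4), new MicrophoneEvidence(74));
        String result = new EmotionRecognitionServer()
                .classify(evidence, new UserStereotype(),
                          Arrays.asList("neutral", "frustration"));
        System.out.println("Recognized emotion: " + result); // prints "frustration"
    }
}
```

The design point, under these assumptions, is that each modality hides its feature extraction behind a common interface, so the server can fuse keyboard, microphone, and stereotype evidence without knowing how any individual score was computed, and new modalities can be added as further subclasses.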

Citation (APA)

Alepis, E., & Virvou, M. (2014). Object oriented design for multiple modalities in affective interaction. Intelligent Systems Reference Library, 64, 87–99. https://doi.org/10.1007/978-3-642-53851-3_8
