The purpose of this chapter is to investigate how an object-oriented (OO) architecture can be adapted to multimodal emotion recognition applications with mobile interfaces. A major obstacle in this direction is that mobile phones, unlike desktop computers, lack the processing power that emotion recognition demands. To overcome this limitation, our approach requires mobile phones to transmit all collected data to a server that is responsible for, among other tasks, emotion recognition. The object-oriented architecture that we have created combines evidence from multiple modalities of interaction, namely the mobile device's keyboard and microphone, as well as data from user stereotypes. All collected information is classified into well-structured objects that have their own properties and methods. The resulting emotion detection platform is capable of processing and re-transmitting information from different mobile sources of multimodal data during human-computer interaction. The interface that has been used as a test bed for the affective mobile interaction is that of an educational m-learning application. © Springer-Verlag Berlin Heidelberg 2014.
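The client–server fusion described in the abstract can be sketched in a few classes. The class names, attributes, and weighting scheme below are illustrative assumptions, not the chapter's actual design: each modality (keyboard, microphone) yields an evidence object, a user-stereotype object supplies prior beliefs, and a server-side method combines them into a single emotion label.

```python
from dataclasses import dataclass

# NOTE: all names and the weighted-sum fusion rule are hypothetical;
# the chapter's own class diagram and fusion method are not reproduced here.

@dataclass
class ModalityEvidence:
    """Evidence from one interaction modality: emotion -> likelihood."""
    modality: str
    likelihoods: dict   # e.g. {"stressed": 0.7, "neutral": 0.3}
    weight: float = 1.0

@dataclass
class UserStereotype:
    """Prior beliefs about the user's emotional tendencies."""
    priors: dict
    weight: float = 0.5

class EmotionRecognitionServer:
    """Server-side fusion: mobile clients transmit their raw evidence
    here, since the phone itself cannot run the heavy recognizers."""

    def classify(self, evidences, stereotype):
        # Start from the stereotype priors, then add weighted
        # per-modality likelihoods for each candidate emotion.
        scores = {e: stereotype.weight * p
                  for e, p in stereotype.priors.items()}
        for ev in evidences:
            for emotion, p in ev.likelihoods.items():
                scores[emotion] = scores.get(emotion, 0.0) + ev.weight * p
        # Return the emotion with the highest combined score.
        return max(scores, key=scores.get)

keyboard = ModalityEvidence("keyboard", {"stressed": 0.7, "neutral": 0.3}, weight=0.8)
audio = ModalityEvidence("microphone", {"stressed": 0.4, "neutral": 0.6}, weight=1.0)
profile = UserStereotype({"stressed": 0.2, "neutral": 0.8})

server = EmotionRecognitionServer()
print(server.classify([keyboard, audio], profile))  # → neutral
```

Keeping each modality in its own object mirrors the chapter's point that evidence sources are well-structured objects with their own properties and methods, so new modalities can be added without changing the server's fusion logic.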
Alepis, E., & Virvou, M. (2014). Object oriented design for multiple modalities in affective interaction. Intelligent Systems Reference Library, 64, 87–99. https://doi.org/10.1007/978-3-642-53851-3_8