Emotions and personality play an essential role in human behavior, reasoning, and decision-making. Humans infer emotions from several modalities and merge them; emotions act as an interface between a subject's internal state and the external world. This paper presents the design, implementation, and testing of the Inference of an Emotional State (I2E): a cognitive architecture based on emotions for assistive robotics applications. The architecture takes as inputs emotions previously recognized by four affective modalities and infers the emotional state to be conveyed to an assistive robot. Unlike solutions that classify emotions from a single signal, the architecture proposed in this article merges four sources of emotional information into one. To bring this inference closer to human judgment, a Mamdani fuzzy system was used to infer the user's personality, and a MultiLayer Perceptron (MLP) was used to infer the robot's personality. The hypothesis tested in this work was based on Mehrabian's studies and on the judgment of three expert psychologists. The I2E architecture proved efficient at identifying an emotion from varied types of input.
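To make the Mamdani-style inference step concrete, the sketch below shows a minimal fuzzy inference of a personality-trait score from PAD-like (pleasure/arousal) inputs, in the spirit of Mehrabian's model. The membership functions, rule base, and variable names are illustrative assumptions for demonstration; they are not the rule base used in the I2E architecture.

```python
# Minimal Mamdani-style fuzzy inference sketch (illustrative only).
# Fuzzy sets, rules, and universes below are assumptions, not the paper's rule base.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with corners a <= b <= c."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Universe of discourse for the output: a trait score in [0, 1].
y = np.linspace(0.0, 1.0, 201)

# Output fuzzy sets: "low", "medium", "high" trait intensity.
out_low = tri(y, -0.5, 0.0, 0.5)
out_med = tri(y, 0.0, 0.5, 1.0)
out_high = tri(y, 0.5, 1.0, 1.5)

def infer(pleasure, arousal):
    """Infer a trait score from PAD-like pleasure/arousal inputs in [-1, 1]."""
    # Input memberships (hypothetical partitions of the [-1, 1] range).
    p_neg, p_pos = tri(pleasure, -2, -1, 0), tri(pleasure, 0, 1, 2)
    a_low, a_high = tri(arousal, -2, -1, 0), tri(arousal, 0, 1, 2)

    # Illustrative rules (min for AND, min-implication, max-aggregation):
    #   R1: pleasure positive AND arousal high -> trait high
    #   R2: pleasure negative AND arousal low  -> trait low
    #   R3: mixed evidence                     -> trait medium
    r1 = np.fmin(min(p_pos, a_high), out_high)
    r2 = np.fmin(min(p_neg, a_low), out_low)
    r3 = np.fmin(max(min(p_pos, a_low), min(p_neg, a_high)), out_med)

    aggregated = np.maximum.reduce([r1, r2, r3])
    # Centroid defuzzification of the aggregated output set.
    return float(np.sum(y * aggregated) / (np.sum(aggregated) + 1e-9))

print(infer(pleasure=0.7, arousal=0.4))  # defuzzified score toward the "high" set
```

The same pattern extends to four modality inputs by adding antecedents and rules; the MLP that infers the robot's personality would replace this rule base with a trained network, which is not reproduced here.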
Martins, P. S., Faria, G., & Cerqueira, J. de J. F. (2020). I2E: A cognitive architecture based on emotions for assistive robotics applications. Electronics (Switzerland), 9(10), 1–19. https://doi.org/10.3390/electronics9101590