Abstract
This paper describes our ongoing effort to build an empathizing and adaptive storyteller system. The system under development aims to deliver a story effectively by combining emotional expressions generated by an avatar or a humanoid robot with the listener's responses, which are monitored in real time. We conducted a pilot study and analyzed the results in two ways: first, through a survey questionnaire based on the participants' subjective ratings; second, through automated video analysis of the participants' emotional facial expressions and eye blinking. The questionnaire results show that male participants tended to empathize more with a story character when a virtual storyteller was present than with audio-only narration. The video analysis results suggest that the participants' eye-blink count is inversely related to their attention.
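As an illustration of the kind of signal the automated video analysis relies on, the sketch below counts blinks from per-frame eye landmarks using the eye aspect ratio (EAR). It is a minimal, hypothetical example under the assumption that eye landmarks are extracted upstream by a face-tracking step; it is not the authors' actual pipeline, and the threshold values are illustrative defaults.

```python
# Minimal sketch: counting blinks from per-frame eye landmarks via the
# eye aspect ratio (EAR). Illustrative only -- not the pipeline used in
# the paper. Each eye is assumed to be given as 6 (x, y) landmarks in
# the standard EAR ordering (p1..p6 around the eye contour).

import numpy as np

def eye_aspect_ratio(eye: np.ndarray) -> float:
    """eye: (6, 2) array of (x, y) landmark coordinates for one eye."""
    vertical_1 = np.linalg.norm(eye[1] - eye[5])
    vertical_2 = np.linalg.norm(eye[2] - eye[4])
    horizontal = np.linalg.norm(eye[0] - eye[3])
    return (vertical_1 + vertical_2) / (2.0 * horizontal)

def count_blinks(frames, ear_threshold=0.21, min_closed_frames=2) -> int:
    """frames: iterable of (left_eye, right_eye) landmark pairs, one per video frame.

    A blink is counted when the mean EAR stays below `ear_threshold` for at
    least `min_closed_frames` consecutive frames and the eyes then reopen.
    """
    blinks = 0
    closed_run = 0
    for left_eye, right_eye in frames:
        ear = (eye_aspect_ratio(left_eye) + eye_aspect_ratio(right_eye)) / 2.0
        if ear < ear_threshold:
            closed_run += 1
        else:
            if closed_run >= min_closed_frames:
                blinks += 1
            closed_run = 0
    return blinks
```

A per-minute blink rate derived this way could then serve as a rough, inverse proxy for listener attention, consistent with the trend reported in the video analysis above.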
Citation
Bae, B. C., Brunete, A., Malik, U., Dimara, E., Jermsurawong, J., & Mavridis, N. (2012). Towards an empathizing and adaptive storyteller system. In AAAI Workshop - Technical Report (Vol. WS-12-14, pp. 63–65). https://doi.org/10.1609/aiide.v8i2.12532