Intrapersonal dependencies in multimodal behavior

Abstract

Human interlocutors automatically adapt verbal and non-verbal signals so that different behaviors become synchronized over time. Multimodal communication comes naturally to humans, whereas this is not the case for Embodied Conversational Agents (ECAs). Knowing which behavioral channels synchronize within and across speakers, and how they align, seems critical for the development of ECAs. Yet, little data-driven research exists that provides guidelines for the synchronization of different channels within an interlocutor. This study focuses on intrapersonal dependencies of multimodal behavior by applying cross-recurrence analysis to a multimodal communication dataset in order to better understand the temporal relationships between language and gestural behavior channels. By shedding light on the intrapersonal synchronization of communicative channels in humans, we provide an initial manual for modality synchronization in ECAs.
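The abstract names cross-recurrence analysis as the method for relating two behavioral channels over time. Below is a minimal, self-contained sketch of that general technique (not the authors' actual pipeline): it builds a binary cross-recurrence matrix between two time series and reads off the recurrence rate along each lagged diagonal to find the temporal offset at which the channels co-occur most. The signal names (speech, gesture), the threshold radius, and the simulated 5-frame lag are illustrative assumptions.

```python
import numpy as np


def cross_recurrence_matrix(x, y, radius):
    """Binary cross-recurrence matrix: CR[i, j] = 1 if |x[i] - y[j]| <= radius."""
    dist = np.abs(x[:, None] - y[None, :])
    return (dist <= radius).astype(int)


def diagonal_recurrence_profile(cr, max_lag):
    """Recurrence rate along each off-diagonal (lag) of the cross-recurrence matrix."""
    return {lag: np.diagonal(cr, offset=lag).mean()
            for lag in range(-max_lag, max_lag + 1)}


# Hypothetical example: a gesture channel that trails a speech channel by 5 frames.
rng = np.random.default_rng(0)
speech = rng.random(200)
gesture = np.roll(speech, 5) + 0.05 * rng.random(200)

cr = cross_recurrence_matrix(speech, gesture, radius=0.1)
profile = diagonal_recurrence_profile(cr, max_lag=10)
peak_lag = max(profile, key=profile.get)
print(f"Peak cross-recurrence at lag {peak_lag} frames")  # expected near +5
```

In a real analysis the two series would be features extracted from recorded dialogue (e.g., speech activity and gesture motion energy), typically embedded with a delay-embedding dimension and normalized before thresholding; the one-dimensional comparison here is kept only to make the lag-profile idea concrete.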

Citation (APA)

Blomsma, P. A., Linders, G. M., Vaitonyte, J., & Louwerse, M. M. (2020). Intrapersonal dependencies in multimodal behavior. In Proceedings of the 20th ACM International Conference on Intelligent Virtual Agents, IVA 2020. Association for Computing Machinery, Inc. https://doi.org/10.1145/3383652.3423872
