Understanding social interpersonal interaction via synchronization templates of facial events

Abstract

Automatic facial expression analysis in interpersonal communication is challenging, not only because conversation partners' facial expressions mutually influence each other, but also because facial expressions cannot be interpreted correctly without taking the social context into account. In this paper, we propose a probabilistic framework to model interactional synchronization between conversation partners based on their facial expressions. Interactional synchronization manifests the temporal dynamics of conversation partners' mutual influence. In particular, the model allows us to discover a set of common and unique facial synchronization templates directly from natural interpersonal interaction, without recourse to any predefined labeling scheme. The facial synchronization templates represent periodic facial event coordinations shared by multiple conversation pairs in a specific social context. We test our model on two different dyadic conversation settings: negotiation and job interview. Based on the discovered facial event coordinations, we are able to predict conversation outcomes with higher accuracy than HMMs and GMMs.
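To make the baseline comparison mentioned above concrete, the sketch below shows one plausible way an HMM baseline of the kind referenced in the abstract could be set up: one Gaussian HMM is fit per conversation-outcome class on dyadic facial-event feature sequences, and a held-out conversation is assigned to the class whose model gives the higher log-likelihood. This is an illustrative assumption, not the authors' model; the library (hmmlearn), feature shapes, and labels are hypothetical.

```python
# Hypothetical HMM baseline sketch (not the paper's synchronization-template model).
import numpy as np
from hmmlearn.hmm import GaussianHMM  # assumed baseline library, not specified in the paper

rng = np.random.default_rng(0)

def make_pair_sequence(n_frames=200, n_features=8):
    """Synthetic stand-in for per-frame facial-event features of one conversation pair."""
    return rng.normal(size=(n_frames, n_features))

def fit_outcome_hmm(sequences, n_states=4):
    """Fit one HMM on all training conversations sharing the same outcome label."""
    X = np.vstack(sequences)                 # concatenate sequences frame-wise
    lengths = [len(s) for s in sequences]    # tell hmmlearn where each sequence ends
    model = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
    model.fit(X, lengths)
    return model

# Toy training data: a few conversation pairs per (hypothetical) outcome class.
train = {
    "good_outcome": [make_pair_sequence() for _ in range(5)],
    "poor_outcome": [make_pair_sequence() for _ in range(5)],
}
models = {label: fit_outcome_hmm(seqs) for label, seqs in train.items()}

# Classify a held-out conversation by maximum log-likelihood under each class HMM.
test_seq = make_pair_sequence()
predicted = max(models, key=lambda label: models[label].score(test_seq))
print("predicted outcome:", predicted)
```

The paper's contribution, by contrast, is to learn shared synchronization templates across conversation pairs rather than fitting independent per-class sequence models of this kind.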

Citation (APA)

Li, R., Curhan, J., & Hoque, M. E. (2018). Understanding social interpersonal interaction via synchronization templates of facial events. In 32nd AAAI Conference on Artificial Intelligence, AAAI 2018 (pp. 1579–1586). AAAI press. https://doi.org/10.1609/aaai.v32i1.11514
