Successful social encounters depend crucially on the smooth exchange of nonverbal cues between two or more interaction partners. Impairments in this exchange are characteristic of developmental conditions such as autism spectrum disorder (ASD), making the modelling of nonverbal behaviour a promising basis for automatic diagnostic tools. In this paper, we focus on the computational analysis of nonverbal behaviour in dyadic social interactions between two adults. We studied three dyad types, composed of either two typical individuals, two autistic individuals, or one typical and one autistic individual. From videos, we extracted both individual features (i.e., head, hand and leg movement) and interpersonal features (i.e., mutual gaze and head, hand and leg synchrony), which were subsequently used to train two classifiers. Our results show that the proposed approach detects ASD with an F-score of 70% and recognises dyad type with an F-score of 72%, which has implications for minimally invasive autism screening.
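The classification stage described above can be sketched as follows. This is an illustrative sketch only: the paper does not specify its classifiers, feature dimensions, or data here, so the synthetic feature matrix, the label scheme, and the choice of a random-forest classifier with cross-validated macro F-score are all assumptions for demonstration, not the authors' actual method.

```python
# Hedged sketch of a dyad-type classifier on per-dyad behavioural features.
# All data and model choices below are hypothetical placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical per-dyad feature vector: individual movement features
# (e.g., head, hand, leg movement for each partner) concatenated with
# interpersonal features (e.g., mutual gaze, movement synchrony).
n_dyads, n_features = 120, 10
X = rng.normal(size=(n_dyads, n_features))
# Hypothetical labels: 0 = typical-typical, 1 = autistic-autistic, 2 = mixed dyad.
y = rng.integers(0, 3, size=n_dyads)

# A generic classifier stands in for the (unspecified) models in the paper.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
# Macro-averaged F-score over 5-fold cross-validation, matching the
# F-score evaluation mentioned in the abstract.
scores = cross_val_score(clf, X, y, cv=5, scoring="f1_macro")
print(f"mean macro F1 over 5 folds: {scores.mean():.2f}")
```

On real features the same pipeline would simply swap the synthetic `X` and `y` for vectors extracted from the videos; with random labels, as here, the cross-validated F-score hovers near chance.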
CITATION STYLE
Celiktutan, O., Wu, W., Vogeley, K., & Georgescu, A. L. (2023). A Computational Approach for Analysing Autistic Behaviour During Dyadic Interactions. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 13643 LNCS, pp. 167–177). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-37660-3_12