Abstract
Advances in the areas of face and gesture analysis, computational paralinguistics, multimodal interaction, and human-computer interaction have all played a major role in shaping research into assistive technologies over the last decade. This has resulted in a breadth of practical applications ranging from diagnosis and treatment tools to social companion technologies. From an analytical perspective, nonverbal cues provide insight into the assessment of mental health and wellbeing (e.g., detecting depression and pain) and the detection of developmental and neurological conditions such as autism, dementia, and schizophrenia. From a synthesis and generation perspective, assistive technologies, whether disembodied or embodied, must be capable of generating engaging, interactive behaviours and interventions that are personalised and adapted to users' needs, profiles, and preferences. While nonverbal cues play an essential role, many key issues remain to be overcome, affecting both the development and the deployment of multimodal technologies in real-world settings. The key aim of this multidisciplinary workshop is to foster cross-pollination by bringing together computer scientists and social psychologists to discuss innovative ideas, challenges, and opportunities for understanding and generating multimodal nonverbal cues within the scope of healthcare applications.
Citation
Celiktutan, O., Georgescu, A. L., & Cummins, N. (2021). Socially Informed AI for Healthcare: Understanding and Generating Multimodal Nonverbal Cues. In ICMI 2021 - Proceedings of the 2021 International Conference on Multimodal Interaction (pp. 874–876). Association for Computing Machinery, Inc. https://doi.org/10.1145/3462244.3480984