Detecting, modeling, and making sense of multimodal data from human users in the wild still poses numerous challenges. From questions of data quality and the reliability of our measurement instruments onward, the multidisciplinary endeavor of developing intelligent adaptive systems for human-computer and human-robot interaction (HCI, HRI) requires a broad range of expertise and more integrative efforts to make such systems reliable, engaging, and user-friendly. At the same time, the spectrum of applications for machine learning and modeling of multimodal data in the wild keeps expanding. From the classroom to the robot-assisted operating theatre, our workshop aims to support a vibrant exchange about current trends and methods in the field of modeling multimodal data in the wild.
CITATION
Küster, D., Putze, F., Alves-Oliveira, P., Paetzel, M., & Schultz, T. (2020). Modeling Socio-Emotional and Cognitive Processes from Multimodal Data in the Wild. In ICMI 2020 - Proceedings of the 2020 International Conference on Multimodal Interaction (pp. 883–885). Association for Computing Machinery, Inc. https://doi.org/10.1145/3382507.3420053