Multimodal Analytics for Real-Time Feedback in Co-located Collaboration

Abstract

Collaboration is an important 21st-century skill that can take place in a remote or co-located setting. Co-located collaboration (CC) is a highly complex process involving subtle human interactions that can be described with multimodal indicators (MI) such as gaze, speech and social skills. In this paper, we first give an overview of related work that has identified indicators during CC. Then, we look into state-of-the-art studies on feedback during CC that also make use of MI. Finally, we describe a Wizard of Oz (WOz) study in which we design a privacy-preserving research prototype aimed at facilitating real-time collaboration in the wild during three co-located group PhD meetings (of 3–7 members). Here, human observers stationed in another room act as a substitute for sensors, tracking different speech-based cues (such as speaking time and turn taking); these cues drive a real-time visualization dashboard on a public shared display. With this research prototype, we want to pave the way for design-based research that tracks other multimodal indicators of CC by extending the prototype design using both humans and sensors.
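
The abstract does not give implementation details, but as a rough illustration of the Wizard-of-Oz pipeline it describes, the sketch below shows how observer-logged speaker events could be aggregated into speaking time and turn-taking counts for a shared dashboard. All names (SpeechCueTracker, speaker_changed, snapshot) are hypothetical and not taken from the paper.

```python
import time
from collections import defaultdict


class SpeechCueTracker:
    """Aggregates observer-logged speaker events into speaking time and turn counts.

    Hypothetical sketch of the WOz setup: human observers (standing in for
    sensors) signal when a participant starts speaking, and the tracker keeps
    the per-speaker statistics that a dashboard could render in real time.
    """

    def __init__(self):
        self.speaking_time = defaultdict(float)  # seconds spoken per participant
        self.turn_count = defaultdict(int)       # turns taken per participant
        self._current_speaker = None
        self._turn_start = None

    def speaker_changed(self, speaker, timestamp=None):
        """Record that `speaker` has taken the floor (observer input)."""
        now = timestamp if timestamp is not None else time.time()
        if self._current_speaker is not None:
            # Close out the previous speaker's turn.
            self.speaking_time[self._current_speaker] += now - self._turn_start
        self._current_speaker = speaker
        self._turn_start = now
        self.turn_count[speaker] += 1

    def snapshot(self):
        """Return the current cue values for rendering on the shared display."""
        return {
            "speaking_time_s": dict(self.speaking_time),
            "turn_count": dict(self.turn_count),
        }


# Example: an observer logs three turns; a dashboard would poll snapshot() periodically.
tracker = SpeechCueTracker()
tracker.speaker_changed("P1", timestamp=0.0)
tracker.speaker_changed("P2", timestamp=12.5)
tracker.speaker_changed("P1", timestamp=20.0)
print(tracker.snapshot())
```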

Citation (APA)

Praharaj, S., Scheffel, M., Drachsler, H., & Specht, M. (2018). Multimodal Analytics for Real-Time Feedback in Co-located Collaboration. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11082 LNCS, pp. 187–201). Springer Verlag. https://doi.org/10.1007/978-3-319-98572-5_15
