Multimodal Behavioral Analytics in Intelligent Learning and Assessment Systems

Abstract

As the boundary blurs between the real and the virtual in today’s learning environments, there is a growing need for new assessment tools that capture the behavioral aspects key to evaluating skills such as problem solving, communication, and collaboration. A key challenge is to capture and understand student behavior at a fidelity sufficient to estimate cognitive and affective states as they manifest through multiple media, including speech, body pose, gestures, and gaze. However, analyzing each of these modalities in isolation may result in incongruities. In addition, the affective states of a person vary significantly over time. To address these technical challenges, this paper presents a framework for developing hierarchical computational models that provide a systematic approach for extracting meaningful evidence from noisy, unstructured data. This approach utilizes multimodal data, including audio, video, and activity log files, and models the temporal dynamics of student behavior patterns. To demonstrate the efficacy of our methodology, we present two pilot studies, from the domains of collaborative learning and in vivo assessment of nonverbal behavior, where this approach has been successfully implemented.
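The abstract describes fusing evidence from multiple modalities and modeling its temporal dynamics, without specifying an implementation. One common way to realize that combination is late fusion of per-modality state likelihoods followed by HMM-style temporal smoothing; the sketch below illustrates that pattern under stated assumptions. The state set, modality names, and fusion weights are all hypothetical, not taken from the chapter.

```python
import numpy as np

# Illustrative affective-state labels (hypothetical, not from the chapter).
STATES = ["engaged", "confused", "bored"]


def fuse_modalities(likelihoods, weights=None):
    """Weighted log-linear late fusion of per-modality state likelihoods.

    likelihoods: dict mapping modality name (e.g. "speech", "gaze")
                 to a (T, S) array of frame-level state likelihoods.
    Returns a (T, S) array of fused, row-normalized probabilities.
    """
    names = sorted(likelihoods)
    if weights is None:
        weights = {m: 1.0 for m in names}  # equal trust in each modality
    # Sum weighted log-likelihoods (product of experts in log space).
    log_fused = sum(weights[m] * np.log(likelihoods[m] + 1e-12) for m in names)
    fused = np.exp(log_fused - log_fused.max(axis=1, keepdims=True))
    return fused / fused.sum(axis=1, keepdims=True)


def forward_smooth(obs_probs, transition, prior):
    """HMM forward pass: filtered state posteriors over time.

    A "sticky" transition matrix (large diagonal) encodes the prior
    that affective states persist across adjacent frames.
    """
    T, S = obs_probs.shape
    alpha = np.zeros((T, S))
    alpha[0] = prior * obs_probs[0]
    alpha[0] /= alpha[0].sum()
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ transition) * obs_probs[t]
        alpha[t] /= alpha[t].sum()
    return alpha
```

In this toy setting, if one modality briefly contradicts the others at a single frame (an "incongruity" in the abstract's terms), the fused frame-level estimate may flip, but the sticky temporal model keeps the smoothed posterior on the state supported by the surrounding context.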

Citation (APA)

Khan, S. M. (2017). Multimodal Behavioral Analytics in Intelligent Learning and Assessment Systems. In Methodology of Educational Measurement and Assessment (pp. 173–184). Springer Nature. https://doi.org/10.1007/978-3-319-33261-1_11
