Multimodal learning analytics to inform learning design: Lessons learned from computing education


Abstract

Programming is a complex learning activity that involves the coordination of cognitive processes and affective states. These aspects are often considered individually in computing education research, which limits our understanding of how and when students learn best. This gap makes it difficult for researchers to contextualize evidence-driven outcomes when learning behaviour deviates from pedagogical intentions. Multimodal learning analytics (MMLA) captures data essential for measuring constructs (e.g., cognitive load, confusion) that are posited in the learning sciences as important for learning and cannot be measured effectively with programming process data (IDE-log data) alone. Thus, we augmented IDE-log data with physiological data (e.g., gaze data) and participants’ facial expressions, collected during a debugging learning activity. The findings emphasize the need for learning analytics that are consequential for learning, rather than merely easy and convenient to collect. In that regard, our paper aims to provoke productive reflection and conversation among the community of educators about the potential of MMLA to expand and advance the synergy of learning analytics and learning design, moving from a post-evaluation, design-aware process to a continuous process of monitoring and adaptation.

Citation (APA)

Mangaroska, K., Sharma, K., Gašević, D., & Giannakos, M. (2020). Multimodal learning analytics to inform learning design: Lessons learned from computing education. Journal of Learning Analytics, 7(3), 79–97. https://doi.org/10.18608/JLA.2020.73.7
