The emergence of immersive digital technologies, such as shared augmented reality (shAR), virtual reality (VR), and motion capture (MC), offers promising new opportunities to advance our understanding of human cognition and to design innovative technology-enhanced learning experiences. Theoretical frameworks for embodied and extended cognition can guide novel ways of understanding and analyzing learning in these environments. This conceptual paper explores a research method in Educational Technology—multimodal analysis for embodied technologies—and provides examples from shAR, VR, and MC projects that use this approach. This analysis involves tracking learners’ gestures, actions on physical and virtual objects, whole-body movements and positions, and talk moves, along with other relevant modalities (e.g., written inscriptions), over time and across space. We show how this analysis surfaces new considerations for designing educational technology to promote collaboration, more fully capture students’ knowledge, and understand and leverage the perspectives of learners.
Walkington, C., Nathan, M. J., Huang, W., Hunnicutt, J., & Washington, J. (2023). Multimodal analysis of interaction data from embodied education technologies. Educational Technology Research and Development. https://doi.org/10.1007/s11423-023-10254-9