Learning analytics in mobile applications based on multimodal interaction

Abstract

One of the most valuable skills for teachers is the ability to produce their own digital solutions, translating teaching concepts into end-user computer systems. However, this often requires the involvement of computing specialists; as a result, the development of educational programming environments remains a challenge. Learning experiences based on multimodal interaction applications (gesture interaction, voice recognition or computer vision) are becoming commonplace in education because they motivate and engage students. This chapter analyses the state of the art in learning analytics (LA) techniques and user-friendly authoring tools, and presents a tool to support the creation of multimodal interactive applications equipped with non-intrusive monitoring and analytics capabilities. This tool enables teachers with no programming skills to create interactive, LA-enriched learning scenarios. To this end, the tool includes several components that manage LA activities, ranging from automatically capturing users’ interaction with mobile applications, to querying data and retrieving metrics, to visualising tables and charts.
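
To make the capture-and-query pipeline described above more concrete, the following Kotlin sketch illustrates what such LA components might look like inside a generated mobile application. It is a minimal illustration under stated assumptions: the class, field and function names (InteractionEvent, InteractionCollector, countByUser) are hypothetical and are not the chapter's actual API.

```kotlin
// Hypothetical sketch of non-intrusive interaction capture plus a simple metric
// query; names and fields are illustrative assumptions, not the chapter's API.

import java.time.Instant

// One captured interaction event (e.g. a gesture or a recognised voice command).
data class InteractionEvent(
    val userId: String,
    val component: String,   // e.g. "GestureSensor", "SpeechRecognizer"
    val action: String,      // e.g. "swipe_left", "utterance_recognized"
    val timestamp: Instant = Instant.now()
)

// Non-intrusive capture: the app reports events to a collector without
// altering the learning activity itself.
class InteractionCollector {
    private val events = mutableListOf<InteractionEvent>()

    fun record(event: InteractionEvent) {
        events += event
    }

    // Simple metric query: how many times each user triggered a given action.
    fun countByUser(action: String): Map<String, Int> =
        events.filter { it.action == action }
            .groupingBy { it.userId }
            .eachCount()
}

fun main() {
    val collector = InteractionCollector()
    collector.record(InteractionEvent("alice", "GestureSensor", "swipe_left"))
    collector.record(InteractionEvent("alice", "GestureSensor", "swipe_left"))
    collector.record(InteractionEvent("bob", "SpeechRecognizer", "utterance_recognized"))

    // A metric of this kind could feed the tables and charts mentioned above.
    println(collector.countByUser("swipe_left"))  // {alice=2}
}
```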

Citation (APA)

Mota, J. M., Ruiz-Rube, I., Dodero, J. M., Person, T., & Arnedillo-Sánchez, I. (2018). Learning analytics in mobile applications based on multimodal interaction. In Lecture Notes on Data Engineering and Communications Technologies (Vol. 11, pp. 67–92). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-319-68318-8_4
