A Good Classifier is Not Enough: A XAI Approach for Urgent Instructor-Intervention Models in MOOCs


Abstract

Deciding when an instructor should intervene, based on learners’ comments that need an urgent response in MOOC environments, is a known challenge. The best solutions proposed use automatic machine learning (ML) models to predict urgency. These are ‘black boxes’, with results opaque to humans. Explainable artificial intelligence (XAI) aims to interpret such models, to enhance trust in artificial intelligence (AI)-based decision-making. We propose applying XAI techniques to interpret a MOOC intervention model by analysing learner comments. We show how pairing a good predictor with XAI results, and especially colour-coded visualisation, could support instructors in deciding on urgent intervention.
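To illustrate the kind of colour-coded visualisation the abstract describes, the sketch below colour-codes each word of a comment by its contribution to a linear urgency score. The word weights, bias, and colour scheme here are hypothetical placeholders for illustration only; they are not the authors' model or data.

```python
# Minimal sketch: colour-coding word contributions from a linear "urgency"
# text classifier. All weights below are invented for illustration.
import math

# Hypothetical per-word weights of a linear model (positive = more urgent).
WEIGHTS = {"deadline": 1.8, "error": 1.2, "help": 0.9, "thanks": -1.1}
BIAS = -0.5

def urgency_score(comment: str) -> float:
    """Logistic score in [0, 1] from summed word weights."""
    z = BIAS + sum(WEIGHTS.get(w, 0.0) for w in comment.lower().split())
    return 1.0 / (1.0 + math.exp(-z))

def highlight(comment: str) -> str:
    """Wrap each word in an HTML span tinted by its contribution:
    red for urgency-increasing words, green for urgency-decreasing ones."""
    spans = []
    for w in comment.split():
        weight = WEIGHTS.get(w.lower(), 0.0)
        colour = "red" if weight > 0 else "green" if weight < 0 else "black"
        spans.append(f'<span style="color:{colour}">{w}</span>')
    return " ".join(spans)

print(urgency_score("help the deadline error"))  # high score -> intervene
print(highlight("thanks for the help"))
```

In a real pipeline, the per-word contributions would come from an XAI method (e.g. feature attributions for the trained classifier) rather than hand-set weights; the visualisation step stays the same.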

Citation (APA)

Alrajhi, L., Pereira, F. D., Cristea, A. I., & Aljohani, T. (2022). A Good Classifier is Not Enough: A XAI Approach for Urgent Instructor-Intervention Models in MOOCs. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 13356 LNCS, pp. 424–427). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-11647-6_84
