Parallel Attention-Driven Model for Student Performance Evaluation


Abstract

This study presents the development and evaluation of a Multi-Task Long Short-Term Memory (LSTM) model with an attention mechanism for predicting students’ academic performance. The research is motivated by the need for efficient tools to enhance student assessment and support tailored educational interventions. The model tackles two tasks: predicting overall performance (total score) as a regression task and classifying performance levels (remarks) as a classification task. By handling both tasks simultaneously, it improves computational efficiency and resource utilization. The dataset includes metrics such as Continuous Assessment, Practical Skills, Presentation Quality, Attendance, and Participation. The model achieved strong results, with a Mean Absolute Error (MAE) of 0.0249, Mean Squared Error (MSE) of 0.0012, and Root Mean Squared Error (RMSE) of 0.0346 for the regression task. For the classification task, it achieved perfect scores with an accuracy, precision, recall, and F1 score of 1.0. The attention mechanism enhanced performance by focusing on the most relevant features. This study demonstrates the effectiveness of the Multi-Task LSTM model with an attention mechanism in educational data analysis, offering a reliable and efficient tool for predicting student performance.
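The abstract describes an LSTM encoder whose hidden states are pooled by an attention mechanism and then shared by two task heads: a regression head for the total score and a classification head for the performance remark. The following is a minimal NumPy sketch of that shared-encoder, two-head pattern; it is not the authors' implementation, and all dimensions, weight initializations, and the simple dot-product attention scoring are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_forward(x, Wx, Wh, b):
    """Single-layer LSTM forward pass; returns the hidden state at every step."""
    T, _ = x.shape
    H = Wh.shape[0]
    h, c = np.zeros(H), np.zeros(H)
    hs = []
    for t in range(T):
        z = x[t] @ Wx + h @ Wh + b          # all four gate pre-activations, (4H,)
        i, f, g, o = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        g = np.tanh(g)
        c = f * c + i * g                   # cell state update
        h = o * np.tanh(c)                  # hidden state
        hs.append(h)
    return np.stack(hs)                     # (T, H)

def attention_pool(hs, w_att):
    """Score each time step, softmax the scores, return the weighted sum."""
    scores = hs @ w_att                     # (T,)
    a = np.exp(scores - scores.max())
    a /= a.sum()                            # attention weights over steps
    return a @ hs                           # attended context vector, (H,)

# Assumed toy sizes: 5 assessment metrics per step, 4 steps, 8 hidden units,
# 3 remark classes (e.g. fail / pass / distinction -- hypothetical labels).
F, T, H, C = 5, 4, 8, 3
x = rng.normal(size=(T, F))                 # one student's sequence of metrics

Wx = rng.normal(scale=0.1, size=(F, 4 * H))
Wh = rng.normal(scale=0.1, size=(H, 4 * H))
b = np.zeros(4 * H)
w_att = rng.normal(size=H)

hs = lstm_forward(x, Wx, Wh, b)
ctx = attention_pool(hs, w_att)

# Two task-specific heads share the same attended context vector.
W_reg = rng.normal(scale=0.1, size=H)
W_cls = rng.normal(scale=0.1, size=(H, C))
total_score = float(ctx @ W_reg)            # regression output: total score
logits = ctx @ W_cls
remark_probs = np.exp(logits - logits.max())
remark_probs /= remark_probs.sum()          # classification output: remark probabilities
```

In training, the two heads would each contribute a loss term (e.g. MSE for the score, cross-entropy for the remark) summed into one objective, which is what lets a single forward pass serve both tasks.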

APA

Olaniyan, D., Olaniyan, J., Obagbuwa, I. C., Esiefarienrhe, B. M., & Bernard, O. P. (2024). Parallel Attention-Driven Model for Student Performance Evaluation. Computers, 13(9). https://doi.org/10.3390/computers13090242
