Paper versus electronic feedback in high stakes assessment

Citations: 5 · Mendeley readers: 34

Abstract

Tablet computers have emerged as increasingly useful tools in medical education, particularly for assessment. However, it is not fully established whether they influence the quality and/or quantity of feedback provided in high-stakes assessments, or how electronically recorded feedback relates to student performance. Our primary aim was to determine whether feedback differed depending on the tool used to record it.

Methods: We compared quantitative and qualitative feedback between paper scoring sheets and iPads™ across two consecutive years of a final-year MBChB (UK medical degree) Objective Structured Clinical Examination. Quality of comments (using a validated five-point rating scale), number of examiner comments and number of words were compared across both methods of recording assessment performance using chi-squared analysis and the independent t-test. We also explored relationships between student performance (checklist and global scoring) and feedback.

Results: Data from 190 students (2850 paper-scored interactions) in 2015 and 193 students (2895 iPad™-scored interactions) in 2016 were analysed. Overall, more comments were given with iPads™ than on paper (42% versus 20%; p < 0.001), but the quality of feedback did not differ significantly. With both recording methods, students with low global scores were more likely to receive comments (p < 0.001).

Conclusion: The use of iPads™ in high-stakes assessment increases the quantity of feedback compared with traditional paper scoring sheets. The quantity and quality of feedback for poorer-performing candidates (by global score) were also better with iPad™ feedback.
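The headline comparison (42% versus 20% of interactions receiving comments, p < 0.001) is a chi-squared test on a 2×2 contingency table. As a minimal sketch, the counts below are a hypothetical reconstruction from the reported percentages and interaction totals (the abstract reports only proportions), and the statistic is computed with the standard Pearson formula for a 2×2 table:

```python
# Hypothetical counts reconstructed from the reported percentages:
# 42% of 2895 iPad-scored and 20% of 2850 paper-scored interactions
# received examiner comments.
ipad_comments = round(0.42 * 2895)    # interactions with comments (iPad)
ipad_none = 2895 - ipad_comments      # interactions without comments (iPad)
paper_comments = round(0.20 * 2850)   # interactions with comments (paper)
paper_none = 2850 - paper_comments    # interactions without comments (paper)

def chi_squared_2x2(a, b, c, d):
    """Pearson chi-squared statistic for a 2x2 table (no continuity correction)."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

chi2 = chi_squared_2x2(ipad_comments, ipad_none, paper_comments, paper_none)
# With df = 1, the critical value for p = 0.001 is 10.83, so a statistic
# far above that is consistent with the reported p < 0.001.
print(f"chi-squared = {chi2:.1f}")
```

In practice an analysis like this would more likely use a library routine such as `scipy.stats.chi2_contingency`; the hand-rolled formula is shown only to make the calculation behind the reported comparison explicit.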

Citation (APA)

Munro, A. J., Cumming, K., Cleland, J., Denison, A. R., & Currie, G. P. (2018). Paper versus electronic feedback in high stakes assessment. Journal of the Royal College of Physicians of Edinburgh, 48(2), 148–152. https://doi.org/10.4997/JRCPE.2018.209
