The Impact of Computers on Marking Behaviors and Assessment: A Many-Facet Rasch Measurement Analysis of Essays by EFL College Students


Abstract

This study employed a mixed-design approach and the Many-Facet Rasch Measurement (MFRM) framework to investigate whether rater bias occurred between the onscreen scoring (OSS) mode and the paper-based scoring (PBS) mode. Nine human raters analytically marked scanned scripts and paper scripts using a six-category (i.e., six-criterion) rating rubric. Interviews with these raters were then conducted to gather their reflections on the marking experience in the two modes. The software program FACETS was employed to estimate raters’ scores, with the results indicating that (a) four raters marked the scanned scripts more severely than they did the paper scripts, whereas four other raters exhibited the opposite tendency; (b) the only rater whose composite scores were comparable between the two modes still awarded harsher scores to the category of Mechanics in PBS mode; and (c) the category of Mechanics was scored more harshly in OSS mode, whereas Topic Development/Support was scored more severely in PBS mode. Analyses of the quantitative and qualitative data indicated that computers affected raters’ marking behaviors and their assessment, playing a role in causing scoring bias. Implications for rater training are provided.
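For context, the many-facet Rasch model underlying a FACETS analysis of this kind is conventionally written (following Linacre's standard formulation; the symbols below are the usual textbook ones and are not taken from this article) as:

$$\ln\left(\frac{P_{nijk}}{P_{nij(k-1)}}\right) = B_n - D_i - C_j - F_k$$

where $P_{nijk}$ is the probability that examinee $n$ receives category $k$ (rather than $k-1$) on criterion $i$ from rater $j$; $B_n$ is the examinee's ability, $D_i$ the difficulty of the rating criterion, $C_j$ the severity of the rater, and $F_k$ the threshold of rating category $k$. A scoring-mode facet (OSS vs. PBS), or a rater-by-mode bias interaction term, can be added to this linear structure to test the mode effects the study reports.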

Citation (APA)

He, T. H. (2019). The Impact of Computers on Marking Behaviors and Assessment: A Many-Facet Rasch Measurement Analysis of Essays by EFL College Students. SAGE Open, 9(2). https://doi.org/10.1177/2158244019846692
