Are achievement gap estimates biased by differential student test effort? Putting an important policy metric to the test


Abstract

Background/Context: Achievement gaps motivate a range of practices and policies aimed at closing those gaps. Most gap studies assume that differences in observed test scores across subgroups measure differences in content mastery. For that assumption to hold, students in the subgroups being compared must give similar effort on the test. Prior research shows that low test effort is prevalent and biases observed test scores downward. What research does not show is whether test effort differs by subgroup and therefore biases estimates of achievement gaps.

Purpose: This study examines whether test effort differs by student subgroup, including by race and gender. The sensitivity of achievement gap estimates to any differences in test effort is also considered.

Research Design: A behavioral proxy for test effort called "rapid guessing" was used. Rapid guessing occurs when a student answers a test item so quickly that the student could not have understood its content. Rates of rapid guessing were compared across subgroups. Achievement gaps were then estimated both unconditionally and conditionally on measures of rapid guessing.

Findings: Test effort differs substantially by subgroup: males rapidly guess nearly twice as often as females in later grades, and Black students rapidly guess more often than White students. These differences in rapid guessing generally do not change substantive interpretations of achievement gaps, although basic conclusions about male-female gaps, and about how gaps change as students progress through school, may shift when models account for test effort.

Conclusions: Although the bias that differential test effort introduces into achievement gap estimates is hard to quantify, the results are an important reminder that test scores reflect achievement only to the extent that students are willing and able to demonstrate what they have learned. Understanding why subgroups differ in test effort would likely be useful to educators and is worthy of additional study.
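The research design described above can be illustrated with a small sketch. The Python example below shows one way to flag rapid guesses from item response times, aggregate them into a per-student rapid-guessing rate, and compare a subgroup gap estimated with and without conditioning on that rate. The synthetic data, the three-second threshold, and the regression specifications are illustrative assumptions, not the paper's actual instrumentation or models.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Synthetic item-level data: one row per student-item pair, with response times.
n_students, n_items = 500, 40
students = pd.DataFrame({
    "student_id": np.arange(n_students),
    "group": rng.integers(0, 2, n_students),  # 0/1 subgroup indicator (e.g., gender)
})
items = pd.DataFrame({"item_id": np.arange(n_items)})
resp = students.merge(items, how="cross")
resp["rt_seconds"] = rng.gamma(shape=2.0, scale=10.0, size=len(resp))

# Flag "rapid guesses": responses faster than an assumed threshold.
RAPID_THRESHOLD = 3.0  # seconds; illustrative only, not the paper's threshold
resp["rapid_guess"] = resp["rt_seconds"] < RAPID_THRESHOLD

# Per-student rapid-guessing rate (share of items answered below the threshold).
effort = (resp.groupby("student_id", as_index=False)["rapid_guess"]
              .mean()
              .rename(columns={"rapid_guess": "rapid_rate"}))

# Synthetic test scores with a small true group gap and an effort penalty.
df = students.merge(effort, on="student_id")
df["score"] = (200 + 5 * df["group"] - 30 * df["rapid_rate"]
               + rng.normal(0, 10, n_students))

# Gap estimate unconditional on test effort.
m0 = sm.OLS(df["score"], sm.add_constant(df[["group"]].astype(float))).fit()

# Gap estimate conditional on the rapid-guessing rate.
m1 = sm.OLS(df["score"],
            sm.add_constant(df[["group", "rapid_rate"]].astype(float))).fit()

print("unconditional gap:", round(m0.params["group"], 2))
print("conditional gap:  ", round(m1.params["group"], 2))
```

Comparing the two coefficients on the group indicator gives a rough sense of how much an observed gap could move once differential rapid guessing is accounted for, which is the kind of sensitivity check the study performs.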

Citation (APA)

Soland, J. (2018). Are achievement gap estimates biased by differential student test effort? Putting an important policy metric to the test. Teachers College Record, 120(12). https://doi.org/10.1177/016146811812001202
