ASSESSING DIFFERENTIAL ITEM FUNCTIONING IN PERFORMANCE TESTS

  • Zwick R
  • Donoghue J
  • Grima A

This article is free to access.

Abstract

Although the belief has been expressed that performance assessments are intrinsically more fair than multiple‐choice measures, some forms of performance assessment may in fact be more likely than conventional tests to tap construct‐irrelevant factors. As performance assessment grows in popularity, it will be increasingly important to monitor the validity and fairness of alternative item types. The assessment of differential item functioning (DIF), as one component of this evaluation, can be helpful in investigating the effect on subpopulations of the introduction of performance tasks. Developing a DIF analysis strategy for performance measures requires decisions as to how the matching variable should be defined and how the analysis procedure should accommodate polytomous responses. In this study, two inferential procedures and two types of descriptive summaries that may be useful in assessing DIF in performance measures were explored and applied to simulated data. All the investigated statistics appear to be worthy of further study.
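
The abstract mentions inferential procedures for polytomous DIF that operate on a matching variable. As an illustration only (the abstract does not name the specific statistics), the sketch below implements two statistics commonly used in this line of research: a Mantel-type chi-square test and a standardized mean difference (SMD), both computed by stratifying examinees on matching-score levels. The function name and the data layout are hypothetical.

```python
import numpy as np

def mantel_and_smd(ref, foc, levels_ref, levels_foc):
    """Mantel chi-square (df = 1) and standardized mean difference (SMD)
    for one polytomous item, matching on discrete score levels.

    ref, foc        : item scores for reference and focal examinees
    levels_ref/foc  : matching-variable level for each examinee
    """
    chi_num = chi_den = smd = 0.0
    n_foc_total = len(foc)
    for k in set(levels_ref.tolist()) | set(levels_foc.tolist()):
        r = ref[levels_ref == k]
        f = foc[levels_foc == k]
        n_r, n_f = len(r), len(f)
        n = n_r + n_f
        if n_r == 0 or n_f == 0 or n < 2:
            continue  # a level needs both groups to contribute
        pooled = np.concatenate([r, f])
        s1 = pooled.sum()              # sum of item scores at level k
        s2 = (pooled ** 2).sum()       # sum of squared item scores
        F_k = f.sum()                  # observed focal score sum
        E_k = n_f * s1 / n             # expectation under no DIF
        V_k = n_r * n_f * (n * s2 - s1 ** 2) / (n ** 2 * (n - 1))
        chi_num += F_k - E_k
        chi_den += V_k
        # SMD: focal-weighted difference in within-level item means
        smd += (n_f / n_foc_total) * (f.mean() - r.mean())
    chi2 = chi_num ** 2 / chi_den if chi_den > 0 else 0.0
    return chi2, smd
```

When the focal and reference score distributions coincide at every matching level, both statistics are exactly zero; departures from zero summarize DIF after conditioning on the matching variable.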

Citation (APA)

Zwick, R., Donoghue, J. R., & Grima, A. (1993). ASSESSING DIFFERENTIAL ITEM FUNCTIONING IN PERFORMANCE TESTS. ETS Research Report Series, 1993(1). https://doi.org/10.1002/j.2333-8504.1993.tb01525.x
