Evaluating the fairness of a high-stakes college entrance exam in Kuwait

Citations: 3
Mendeley readers: 14

This article is free to access.

Abstract

The use of college entrance exams to facilitate admission decisions has become controversial, and the central argument concerns the fairness of test scores. The Kuwait University English Aptitude Test (KUEAT) is a high-stakes test, yet very few studies have examined the psychometric quality of scores from this national-level assessment. This study illustrates how measurement approaches can be used to examine fairness issues in educational testing. Adopting a modern view of fairness, we assess the internal and external bias of KUEAT scores using differential item functioning (DIF) analysis and differential prediction analysis, respectively, and provide a comprehensive fairness argument for KUEAT scores. The analysis of internal evidence of bias was based on the KUEAT scores of 1790 examinees tested in November 2018. The KUEAT scores and first-year college GPAs of 4033 students enrolled in KU were used to assess external evidence of bias. Results revealed many items showing differential item functioning across student subpopulation groups (i.e., nationality, gender, high school major, and high school type). KUEAT scores also predicted college performance differentially across student subgroups (i.e., nationality, high school major, and high school type). Discussion and implications for fairness issues in college entrance testing in Kuwait are provided.
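The abstract names differential item functioning analysis as the method for assessing internal evidence of bias. One common DIF procedure (the article does not specify which was used, so this is illustrative only) is the Mantel-Haenszel approach: examinees are stratified by total score, a 2×2 group-by-response table is built per stratum, and a common odds ratio is pooled across strata. A minimal sketch with entirely hypothetical counts:

```python
import math

# Hypothetical counts for one item, stratified by total-score level.
# Each stratum: (ref_correct, ref_wrong, focal_correct, focal_wrong).
strata = [
    (40, 60, 25, 75),   # low scorers
    (70, 30, 50, 50),   # middle scorers
    (90, 10, 75, 25),   # high scorers
]

def mantel_haenszel_odds_ratio(strata):
    """Pooled odds ratio across score strata; values far from 1 suggest DIF."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
    return num / den

alpha = mantel_haenszel_odds_ratio(strata)
delta = -2.35 * math.log(alpha)  # ETS delta scale; |delta| >= 1.5 flags moderate-to-large DIF
print(round(alpha, 3), round(delta, 3))
```

Here the reference group answers correctly more often at every score level, so the pooled odds ratio exceeds 1 and the item would be flagged for review. The external evidence of bias described in the abstract (differential prediction) would instead regress first-year GPA on KUEAT scores with group-by-score interaction terms.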

APA

Shamsaldeen, F., Wang, J., & Ahn, S. (2024). Evaluating the fairness of a high-stakes college entrance exam in Kuwait. Language Testing in Asia, 14(1). https://doi.org/10.1186/s40468-024-00301-4
