Multidimensional CAT item selection methods for domain scores and composite scores with item exposure control and content constraints

Abstract

The intent of this research was to find an item selection procedure in the multidimensional computer adaptive testing (CAT) framework that yielded higher precision for both the domain and composite abilities, made greater use of the item pool, and controlled the exposure rate. Five multidimensional CAT item selection procedures (minimum angle; volume; minimum error variance of the linear combination; minimum error variance of the composite score with optimized weight; and Kullback-Leibler information) were studied and compared with two methods for item exposure control (the Sympson-Hetter procedure and the fixed-rate procedure, the latter simply placing a ceiling on each item's exposure rate) using simulated data. The maximum priority index method was used for the content constraints. Results showed that the Sympson-Hetter procedure yielded better precision than the fixed-rate procedure but had much lower item pool usage and took more time. The five item selection procedures performed similarly under Sympson-Hetter. For the fixed-rate procedure, there was a trade-off between the precision of the ability estimates and the item pool usage, and the five procedures showed different patterns: (1) Kullback-Leibler had better precision but lower item pool usage; (2) minimum angle and volume had balanced precision and item pool usage; and (3) the two methods minimizing the error variance had the best item pool usage and comparable overall score recovery but less precision for certain domains. The priority index for content constraints and item exposure was implemented successfully. © 2014 by the National Council on Measurement in Education.
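The abstract names the selection and exposure-control procedures without detailing them. As a rough illustration only, the sketch below shows how Kullback-Leibler-based item selection combined with Sympson-Hetter exposure control might be wired together for a compensatory multidimensional 2PL model. Everything here is an assumption for illustration, not the paper's implementation: the function names (prob_correct, kl_index, select_item), the grid approximation of the KL index, the fixed exposure parameters, and the toy item pool.

```python
import numpy as np

rng = np.random.default_rng(0)

def prob_correct(theta, a, b):
    """Compensatory multidimensional 2PL: P(correct) = 1 / (1 + exp(-(a . theta - b)))."""
    return 1.0 / (1.0 + np.exp(-(a @ theta - b)))

def kl_index(theta_hat, a, b, delta=0.5, n_grid=5):
    """KL index for one item: average KL divergence between the item response
    distribution at theta_hat and at grid points in a cube of half-width delta
    around theta_hat (a discrete stand-in for the integral form)."""
    d = theta_hat.size
    axes = [np.linspace(t - delta, t + delta, n_grid) for t in theta_hat]
    grid = np.stack(np.meshgrid(*axes, indexing="ij"), axis=-1).reshape(-1, d)
    p0 = prob_correct(theta_hat, a, b)
    total = 0.0
    for theta in grid:
        p = prob_correct(theta, a, b)
        total += p0 * np.log(p0 / p) + (1 - p0) * np.log((1 - p0) / (1 - p))
    return total / len(grid)

def select_item(theta_hat, pool, administered, sh_k):
    """Rank unused items by KL index, then apply Sympson-Hetter control: the
    provisionally selected item is administered with probability sh_k[i];
    otherwise it is set aside for this examinee and the next-best item is tried."""
    candidates = [i for i in range(len(pool)) if i not in administered]
    candidates.sort(key=lambda i: kl_index(theta_hat, pool[i]["a"], pool[i]["b"]),
                    reverse=True)
    for i in candidates:
        if rng.random() <= sh_k[i]:
            return i
    return candidates[-1] if candidates else None  # fall back if every item is rejected

# Toy pool: 20 dichotomous items measuring 2 domains.
pool = [{"a": rng.uniform(0.5, 2.0, size=2), "b": rng.normal()} for _ in range(20)]
sh_k = np.full(len(pool), 0.8)   # exposure parameters; in practice calibrated iteratively
theta_hat = np.zeros(2)          # provisional ability estimate
item = select_item(theta_hat, pool, administered=set(), sh_k=sh_k)
print("next item:", item)
```

In a full simulation along the lines the abstract describes, the exposure parameters sh_k would be recalibrated over repeated simulees until observed administration rates fall below the target maximum, and the selection criterion would additionally be weighted by content-constraint quotas (the maximum priority index); neither step is shown in this sketch.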

Citation (APA)

Yao, L. (2014). Multidimensional CAT item selection methods for domain scores and composite scores with item exposure control and content constraints. Journal of Educational Measurement, 51(1), 18–38. https://doi.org/10.1111/jedm.12032
