Extending item response theory to online homework

Abstract

Item response theory (IRT) is becoming an increasingly important tool for analyzing "big data" gathered from online educational venues. However, the theory was originally developed for traditional exam settings, and several of its assumptions are violated when it is deployed online. For a large-enrollment physics course for scientists and engineers, this study compares outcomes from IRT analyses of exam and homework data, and then investigates the effect of each confounding factor introduced in the online realm. It is found that IRT yields the correct trends for learner ability and meaningful item parameters, yet overall agreement with exam data is moderate. It is also found that learner ability and item discrimination are robust over a wide range with respect to model assumptions and introduced noise; item difficulty is also robust, but over a narrower range. © Published by the American Physical Society.
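
For reference, "item difficulty" and "item discrimination" in the abstract are parameters of a logistic IRT model. In the widely used two-parameter logistic (2PL) form, shown here as a representative example (the abstract does not specify which IRT variant the paper employs), the probability that a learner of ability \theta answers item i correctly is

    P_i(\theta) = \frac{1}{1 + e^{-a_i(\theta - b_i)}}

where a_i is the item's discrimination and b_i its difficulty: a larger a_i means the item separates learners of nearby ability more sharply, and a learner with \theta = b_i has a 50% chance of answering correctly.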

Cite (APA)

Kortemeyer, G. (2014). Extending item response theory to online homework. Physical Review Special Topics - Physics Education Research, 10(1), 010118. https://doi.org/10.1103/PhysRevSTPER.10.010118
