Inattentive responses can threaten measurement quality, yet they are common in rating- or Likert-scale data. In this study, we proposed a new mixture item response theory model to distinguish inattentive responses from normal responses so that test validity can be ascertained. Simulation studies demonstrated that the parameters of the new model were recovered fairly well using Bayesian methods implemented in the freeware WinBUGS, and that fitting the new model to data without inattentive responses did not yield severely biased parameter estimates. In contrast, ignoring inattentive responses by fitting standard item response theory models to data containing them produced seriously biased parameter estimates and a failure to distinguish inattentive participants from normal participants; the person-fit statistic lz was likewise unsatisfactory in identifying inattentive responses. Two empirical examples demonstrated applications of the new model.
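The lz person-fit statistic mentioned above is the standardized log-likelihood of a person's response pattern (Drasgow, Levine, & Williams, 1985): the observed log-likelihood is centered by its expectation and scaled by its standard deviation under the fitted IRT model, so large negative values flag aberrant (e.g., inattentive) responding. The paper's mixture model and its WinBUGS estimation are not reproduced here; the sketch below only illustrates lz for dichotomous items, with made-up probabilities and response patterns.

```python
import numpy as np

def lz_statistic(responses, p):
    """Standardized log-likelihood person-fit statistic lz for
    dichotomous items (Drasgow, Levine, & Williams, 1985).

    responses : 0/1 array of one person's item scores
    p         : model-implied probabilities of a keyed response,
                evaluated at the person's ability estimate
    """
    u = np.asarray(responses, dtype=float)
    p = np.asarray(p, dtype=float)
    q = 1.0 - p
    # Observed log-likelihood of the response pattern
    l0 = np.sum(u * np.log(p) + (1.0 - u) * np.log(q))
    # Expectation and variance of l0 under the model
    e = np.sum(p * np.log(p) + q * np.log(q))
    v = np.sum(p * q * np.log(p / q) ** 2)
    return (l0 - e) / np.sqrt(v)

# Hypothetical probabilities for five items at one ability level
probs = np.array([0.9, 0.8, 0.7, 0.6, 0.4])
attentive = np.array([1, 1, 1, 1, 0])    # pattern consistent with probs
inattentive = np.array([0, 0, 1, 0, 1])  # pattern inconsistent with probs
print(lz_statistic(attentive, probs))
print(lz_statistic(inattentive, probs))
```

A model-consistent pattern yields lz near zero, while the inconsistent pattern yields a markedly negative value; in practice a cutoff such as lz < -1.645 is sometimes used to flag misfit, though the abstract notes this approach performed poorly for detecting inattention relative to the proposed mixture model.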
CITATION STYLE
Jin, K. Y., Chen, H. F., & Wang, W. C. (2018). Mixture Item Response Models for Inattentive Responding Behavior. Organizational Research Methods, 21(1), 197–225. https://doi.org/10.1177/1094428117725792