Multinomial processing tree models assume that discrete cognitive states determine observed response frequencies. Generalized processing tree (GPT) models extend this conceptual framework to continuous variables such as response times, process-tracing measures, or neurophysiological variables. GPT models assume finite-mixture distributions, with weights determined by a processing tree structure, and continuous components modeled by parameterized distributions such as Gaussians with separate or shared parameters across states. We discuss identifiability, parameter estimation, model testing, a modeling syntax, and the improved precision of GPT estimates. Finally, a GPT version of the feature comparison model of semantic categorization is applied to computer-mouse trajectories.
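To make the finite-mixture idea concrete, the following is a minimal sketch (not the authors' implementation) of a GPT-style density for a hypothetical two-branch tree: a "detect" state produces an "old" response with fast response times, and otherwise a "guess" state produces "old" with probability g and slower response times. All parameter names (`d`, `g`, `mu_det`, `mu_guess`, `sigma`) are illustrative assumptions; the tree structure determines the mixture weights, and Gaussians serve as the continuous component distributions.

```python
import math

def normal_pdf(y, mu, sigma):
    """Gaussian density, used as the continuous component distribution."""
    return math.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def gpt_density(y, response, params):
    """Joint density of a discrete response and a continuous variable y
    under a hypothetical two-branch processing tree (illustrative only)."""
    d, g = params["d"], params["g"]
    mu_det, mu_guess, sigma = params["mu_det"], params["mu_guess"], params["sigma"]
    if response == "old":
        # Mixture weights are branch probabilities implied by the tree:
        # detection (prob. d) or guessing "old" (prob. (1 - d) * g).
        return (d * normal_pdf(y, mu_det, sigma)
                + (1 - d) * g * normal_pdf(y, mu_guess, sigma))
    else:
        # A "new" response can only arise from the guessing state.
        return (1 - d) * (1 - g) * normal_pdf(y, mu_guess, sigma)
```

Because the branch probabilities sum to one, integrating the density over y and summing across both response categories recovers total probability one, which is the defining property of the finite-mixture construction.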
Heck, D. W., Erdfelder, E., & Kieslich, P. J. (2018). Generalized Processing Tree Models: Jointly Modeling Discrete and Continuous Variables. Psychometrika, 83(4), 893–918. https://doi.org/10.1007/s11336-018-9622-0