Generalized Processing Tree Models: Jointly Modeling Discrete and Continuous Variables

Abstract

Multinomial processing tree models assume that discrete cognitive states determine observed response frequencies. Generalized processing tree (GPT) models extend this conceptual framework to continuous variables such as response times, process-tracing measures, or neurophysiological variables. GPT models assume finite-mixture distributions, with weights determined by a processing tree structure, and continuous components modeled by parameterized distributions such as Gaussians with separate or shared parameters across states. We discuss identifiability, parameter estimation, model testing, a modeling syntax, and the improved precision of GPT estimates. Finally, a GPT version of the feature comparison model of semantic categorization is applied to computer-mouse trajectories.
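To make the mixture structure sketched above concrete, one way to write the GPT likelihood (notation introduced here for illustration, not quoted from the article) is as follows: for a response falling in observable category $j$, the conditional density of a continuous measure $x$ is

\[
  f_j(x \mid \theta, \eta) \;=\; \sum_{i \in B_j} \frac{p_i(\theta)}{\sum_{k \in B_j} p_k(\theta)} \, g\!\left(x ; \eta_{s(i)}\right),
\]

where $B_j$ denotes the set of processing-tree branches terminating in category $j$, $p_i(\theta)$ are the branch probabilities of the underlying multinomial processing tree, $s(i)$ maps a branch to its latent processing state, and $g(\cdot\,;\eta_{s(i)})$ is a parameterized component density (e.g., a Gaussian) whose parameters may be shared or separate across states, as described in the abstract.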

Citation (APA)

Heck, D. W., Erdfelder, E., & Kieslich, P. J. (2018). Generalized Processing Tree Models: Jointly Modeling Discrete and Continuous Variables. Psychometrika, 83(4), 893–918. https://doi.org/10.1007/s11336-018-9622-0
