Human Sentence Processing: Recurrence or Attention?

Abstract

Recurrent neural networks (RNNs) have long been an architecture of interest for computational models of human sentence processing. The recently introduced Transformer architecture outperforms RNNs on many natural language processing tasks, but little is known about its ability to model human language processing. We compare Transformer- and RNN-based language models' ability to account for measures of human reading effort. Our analysis shows Transformers to outperform RNNs in explaining self-paced reading times and neural activity during reading of English sentences, challenging the widely held idea that human sentence processing involves recurrent and immediate processing and providing evidence for cue-based retrieval.
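The abstract does not spell out how the language models are linked to the reading measures. A common linking hypothesis in this line of work, assumed here purely for illustration, is word surprisal: the less predictable a word is under the model, the more effort it should take to read. The sketch below (GPT-2 and the example sentence are illustrative choices, not necessarily the authors' exact setup) computes per-token surprisal with a pretrained Transformer language model; such values could then be regressed against self-paced reading times or neural activity.

```python
# Minimal sketch, assuming a surprisal-based linking hypothesis and GPT-2 as
# the Transformer language model (illustrative choices, not the authors' code).
import numpy as np
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2").eval()

def token_surprisals(sentence: str):
    """Surprisal (in bits) of each token given its left context."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        logits = model(**enc).logits[0]            # [seq_len, vocab]
    log_probs = torch.log_softmax(logits, dim=-1)
    ids = enc.input_ids[0]
    surprisals = []
    for i in range(1, len(ids)):                   # first token has no left context
        surprisals.append(-log_probs[i - 1, ids[i]].item() / np.log(2))
    tokens = tokenizer.convert_ids_to_tokens(ids.tolist())[1:]
    return list(zip(tokens, surprisals))

# Toy usage: higher surprisal should predict longer self-paced reading times.
for tok, s in token_surprisals("The horse raced past the barn fell."):
    print(f"{tok:>12s}  {s:6.2f} bits")
```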

Citation (APA)

Merkx, D., & Frank, S. L. (2021). Human sentence processing: Recurrence or attention? In Proceedings of the Workshop on Cognitive Modeling and Computational Linguistics (CMCL 2021) (pp. 12–22). Association for Computational Linguistics. https://doi.org/10.18653/v1/2021.cmcl-1.2