Neural speed reading audited


Abstract

Several approaches to neural speed reading have been presented at major NLP and machine learning conferences in 2017–20, i.e., "human-inspired" recurrent network architectures that learn to "read" text faster by skipping irrelevant words, typically optimizing the joint objective of minimizing classification error rate and FLOPs used at inference time. This paper reflects on the meaningfulness of the speed reading task, showing (a) that better and faster approaches to, say, document classification already exist, which also learn to ignore part of the input (I give an example with a 7% error reduction and a 136x speed-up over the state of the art in neural speed reading); and (b) that any claims that neural speed reading is "human-inspired" are ill-founded.
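
The "joint objective" the abstract mentions is typically instantiated as a classification loss plus a differentiable penalty on expected compute. The following is a minimal sketch of such a loss, assuming PyTorch; the function name, the read-probability formulation, and the trade-off weight lam are illustrative assumptions, not the formulation of any specific audited architecture.

    import torch
    import torch.nn.functional as F

    def speed_reading_loss(logits, labels, read_probs, lam=0.01):
        """Illustrative joint objective for a neural speed reader
        (a hypothetical sketch, not the paper's own code).

        logits:     (batch, num_classes) classifier outputs
        labels:     (batch,) gold class indices
        read_probs: (batch, seq_len) per-token probability that the
                    model processes, rather than skips, each token
        lam:        assumed trade-off weight between accuracy and
                    inference-time compute
        """
        ce = F.cross_entropy(logits, labels)
        # The expected fraction of tokens read serves as a
        # differentiable proxy for FLOPs used at inference time.
        expected_compute = read_probs.mean()
        return ce + lam * expected_compute

    # Example usage with random placeholder tensors:
    logits = torch.randn(8, 2)
    labels = torch.randint(0, 2, (8,))
    read_probs = torch.rand(8, 100)
    loss = speed_reading_loss(logits, labels, read_probs)

Raising lam pushes the model to skip more tokens at the cost of accuracy; that accuracy/speed trade-off is exactly what the audit evaluates against simpler baselines.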

Citation (APA)

Søgaard, A. (2020). Neural speed reading audited. In Findings of the Association for Computational Linguistics: EMNLP 2020 (pp. 148–153). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.findings-emnlp.14
