Sequence classification with human attention


Abstract

Learning attention functions requires large volumes of data, but many NLP tasks simulate human behavior. In this paper, we show that human attention provides a good inductive bias for many attention functions in NLP. Specifically, we use estimated human attention derived from eye-tracking corpora to regularize attention functions in recurrent neural networks. We show substantial improvements across a range of tasks, including sentiment analysis, grammatical error detection, and detection of abusive language.
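The regularization idea described in the abstract can be sketched as adding a penalty that pulls the model's attention distribution toward a distribution estimated from human eye-tracking data. The sketch below is an illustration, not the paper's implementation: the function name, the squared-error penalty, and the `lam` weight are assumptions, and `human_fixations` stands in for per-token fixation durations from a hypothetical eye-tracking corpus.

```python
import math

def softmax(scores):
    # Numerically stable softmax over a list of attention scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def attention_regularized_loss(scores, human_fixations, task_loss, lam=0.1):
    """Combine a task loss with a human-attention regularizer (sketch).

    scores: the model's unnormalized per-token attention scores.
    human_fixations: per-token fixation durations (hypothetical
        eye-tracking input), normalized here into a distribution.
    task_loss: the downstream classification loss, e.g. cross-entropy.
    lam: assumed weight trading off the two terms.
    """
    alpha = softmax(scores)
    total = sum(human_fixations)
    h = [f / total for f in human_fixations]  # durations -> distribution
    # Squared-error penalty between model and human attention.
    reg = sum((a - t) ** 2 for a, t in zip(alpha, h))
    return task_loss + lam * reg
```

During training, this combined scalar would replace the plain task loss, so gradient updates nudge attention toward human fixation patterns while still optimizing the classification objective.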

Citation (APA)

Barrett, M., Bingel, J., Hollenstein, N., Rei, M., & Søgaard, A. (2018). Sequence classification with human attention. In CoNLL 2018 - 22nd Conference on Computational Natural Language Learning, Proceedings (pp. 302–312). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/k18-1030
