Relation Classification with Cognitive Attention Supervision

Abstract

Many current language models, such as BERT, use attention mechanisms to transform sequence representations. We ask whether BERT's attention can be influenced by human reading patterns derived from eye-tracking and brain-imaging data. We fine-tune BERT for relation extraction with an auxiliary attention-supervision objective in which BERT's attention weights are supervised by the cognitive data. Across a variety of metrics, we find that this supervision increases the similarity between the model's attention distributions over sequences and the cognitive data without significantly affecting classification performance, while producing errors distinct from those of the baseline. In particular, models with cognitive attention supervision more often correctly classify samples that the baseline misclassifies.
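To make the auxiliary attention-supervision objective concrete, below is a minimal sketch of one plausible implementation. It is not the authors' released code: the helper `fixation_to_distribution`, the choice of layer and head pooling, the KL-divergence formulation, the 19-class label space, and the 0.1 loss weighting are all illustrative assumptions. The sketch only shows how a classification loss and an attention-to-cognitive-signal loss might be combined during fine-tuning.

```python
# Sketch of auxiliary attention supervision for relation classification.
# Assumptions (not from the paper): helper names, layer/head pooling,
# KL-divergence loss, num_labels=19, and the 0.1 auxiliary weight.
import torch
import torch.nn.functional as F
from transformers import BertForSequenceClassification, BertTokenizerFast

model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=19, output_attentions=True
)
tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")


def fixation_to_distribution(fixations: torch.Tensor) -> torch.Tensor:
    """Normalize per-token cognitive signals (e.g., total fixation duration)
    into a probability distribution over the sequence. Illustrative helper."""
    fixations = fixations.clamp(min=0.0) + 1e-8
    return fixations / fixations.sum(dim=-1, keepdim=True)


def attention_supervision_loss(attentions, target_dist, layer=-1):
    """KL divergence between the model's token-level attention distribution
    (average attention each token receives, pooled over heads and query
    positions in one layer) and the cognitive target distribution."""
    attn = attentions[layer]                 # (batch, heads, seq, seq)
    received = attn.mean(dim=1).mean(dim=1)  # (batch, seq)
    received = received / received.sum(dim=-1, keepdim=True)
    return F.kl_div(received.log(), target_dist, reduction="batchmean")


# One hypothetical training step with dummy inputs.
enc = tokenizer(["The [E1]fire[/E1] spread through the [E2]forest[/E2]."],
                return_tensors="pt", padding=True)
labels = torch.tensor([3])                      # dummy relation label
fixations = torch.rand(enc["input_ids"].shape)  # stand-in for aligned eye-tracking data

outputs = model(**enc, labels=labels)
target = fixation_to_distribution(fixations)
aux = attention_supervision_loss(outputs.attentions, target)
loss = outputs.loss + 0.1 * aux                 # classification loss + weighted auxiliary loss
loss.backward()
```

In practice the cognitive signals would need to be aligned to BERT's wordpiece tokenization, and the supervised layers, heads, and loss weighting would be hyperparameters.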

Citation (APA)

McGuire, E. S., & Tomuro, N. (2021). Relation classification with cognitive attention supervision. In Proceedings of the Workshop on Cognitive Modeling and Computational Linguistics (CMCL 2021) (pp. 222–232). Association for Computational Linguistics. https://doi.org/10.18653/v1/2021.cmcl-1.26
