Exploring different dimensions of attention for uncertainty detection

Abstract

Neural networks with attention have proven effective for many natural language processing tasks. In this paper, we develop attention mechanisms for uncertainty detection. In particular, we generalize standardly used attention mechanisms by introducing external attention and sequence-preserving attention. These novel architectures differ from standard approaches in that they use external resources to compute attention weights and preserve sequence information. We compare them to other configurations along different dimensions of attention. Our novel architectures set the new state of the art on a Wikipedia benchmark dataset and perform similarly to the state-of-the-art model on a biomedical benchmark that uses a large set of linguistic features.
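
The abstract describes the two novel mechanisms only at a high level. As an illustration (not the authors' implementation), here is a minimal NumPy sketch of one way "external" and "sequence-preserving" attention could be realized: attention weights computed from per-token scores supplied by an outside resource (e.g., an uncertainty-cue lexicon) rather than from the hidden states, and sequence preservation via keeping the k highest-scoring positions in their original order. All function names, shapes, and the toy scores below are hypothetical.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def external_attention(hidden_states, external_scores):
    # "External" attention sketch: the weights alpha come from an
    # external resource (hypothetical per-token cue scores), not from
    # the hidden states themselves.
    alphas = softmax(external_scores)   # shape (T,)
    return alphas @ hidden_states       # weighted average, shape (d,)

def sequence_preserving_attention(hidden_states, scores, k):
    # "Sequence-preserving" attention sketch (one possible realization):
    # instead of collapsing the sequence into a single vector, keep the
    # k highest-scoring positions in their original order.
    top = np.sort(np.argsort(scores)[-k:])  # top-k indices, sequence order
    return hidden_states[top]               # shape (k, d)

# Toy usage: T=5 timesteps, d=4 hidden dimensions.
rng = np.random.default_rng(0)
H = rng.normal(size=(5, 4))
cue_scores = np.array([0.1, 2.0, 0.3, 1.5, 0.2])  # hypothetical lexicon scores
print(external_attention(H, cue_scores).shape)            # (4,)
print(sequence_preserving_attention(H, cue_scores, 2).shape)  # (2, 4)
```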

Cite

APA

Adel, H., & Schütze, H. (2017). Exploring different dimensions of attention for uncertainty detection. In 15th Conference of the European Chapter of the Association for Computational Linguistics, EACL 2017 - Proceedings of Conference (Vol. 1, pp. 22–34). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/e17-1003
