Ruminating Reader: Reasoning with Gated Multi-Hop Attention


Abstract

To answer a question in a machine comprehension (MC) task, a model needs to establish the interaction between the question and the context. To address the limitation that a single-pass model cannot reflect on and correct its answer, we present Ruminating Reader. Ruminating Reader adds a second pass of attention and a novel information fusion component to the Bi-Directional Attention Flow model (BiDAF). We propose novel layer structures that construct a query-aware context vector representation and fuse the encoding representation with the intermediate representation on top of the BiDAF model. We show that a multi-hop attention mechanism can be applied to a bi-directional attention structure. In experiments on SQuAD, we find that Ruminating Reader outperforms the BiDAF baseline by 2.1 F1 and 2.7 EM. Our analysis shows that the different attention hops take on different responsibilities in selecting answers.
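
As a rough illustration of the kind of gated information fusion described in the abstract, the sketch below combines a first-pass contextual encoding with a query-aware summary vector through a learned gate before a second attention hop. This is a minimal PyTorch-style sketch under assumed tensor shapes; the GatedFusion class name and its linear projections are illustrative assumptions, not the paper's exact ruminate layers.

```python
import torch
import torch.nn as nn


class GatedFusion(nn.Module):
    """Illustrative gated fusion of a context encoding with a query-aware summary."""

    def __init__(self, hidden_size: int):
        super().__init__()
        # Gate decides, per dimension, how much of the fused candidate to inject.
        self.gate = nn.Linear(2 * hidden_size, hidden_size)
        self.proj = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, context_enc: torch.Tensor, summary: torch.Tensor) -> torch.Tensor:
        # context_enc: (batch, ctx_len, hidden) first-pass encoding of the context
        # summary:     (batch, hidden) query-aware summary, broadcast over positions
        summary = summary.unsqueeze(1).expand_as(context_enc)
        joint = torch.cat([context_enc, summary], dim=-1)
        g = torch.sigmoid(self.gate(joint))   # fusion gate in [0, 1]
        fused = torch.tanh(self.proj(joint))  # candidate fused representation
        # Gated mix: keep the original encoding where g is small,
        # inject the query-aware information where g is large.
        return g * fused + (1.0 - g) * context_enc


if __name__ == "__main__":
    layer = GatedFusion(hidden_size=100)
    ctx = torch.randn(2, 50, 100)   # batch of 2 contexts, 50 tokens each
    summ = torch.randn(2, 100)      # one query-aware summary vector per example
    out = layer(ctx, summ)          # (2, 50, 100), ready for a second attention hop
```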

Cite

APA

Gong, Y., & Bowman, S. R. (2018). Ruminating Reader: Reasoning with Gated Multi-Hop Attention. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 1–11). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/w18-2601
