Enhancing the recurrent neural networks with positional gates for sentence representation


Abstract

Recurrent neural networks (RNNs) with attention mechanisms have shown good performance on answer selection in recent years. Most previous attention mechanisms generate the attentive weights only after all the hidden states have been obtained, so contextual information from the other sentence is not exploited during the generation of the internal hidden states. In this paper, we propose a position-gated RNN (PG-RNN) model, which merges the positional contextual information of the question words into the inner hidden-state generation. Specifically, we first design a positional interaction monitor to detect and measure the positional influence of each question word within the answer sentence. We then present a positional gating mechanism and embed it into the RNN so that it automatically absorbs the positional contextual information during the hidden-state update. Experiments on two benchmark datasets, TREC-QA and WikiQA, show the advantages of the proposed model; in particular, it achieves new state-of-the-art performance on both.
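To make the gating idea concrete, below is a minimal PyTorch sketch of a GRU-style cell extended with a positional gate. It is an illustration of the general technique the abstract describes, not the paper's exact formulation: the class and argument names are hypothetical, and the positional interaction monitor is approximated here by a precomputed scalar score m_t (one per answer word) measuring question-word influence at that position.

```python
import torch
import torch.nn as nn

class PositionGatedRNNCell(nn.Module):
    """Illustrative GRU-style cell with an extra positional gate.

    The gate p_t is conditioned on the current input, the previous hidden
    state, and a scalar positional-influence score m_t (assumed to come
    from a question-answer interaction monitor). It decides, per hidden
    dimension, how much of the candidate update to absorb. All names and
    the exact gating form are assumptions for this sketch.
    """

    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.gru = nn.GRUCell(input_size, hidden_size)
        # Positional gate over [input; previous state; influence score].
        self.gate = nn.Linear(input_size + hidden_size + 1, hidden_size)

    def forward(self, x_t, h_prev, m_t):
        # x_t: (batch, input_size) answer-word embedding at step t
        # h_prev: (batch, hidden_size) previous hidden state
        # m_t: (batch, 1) positional influence of the question words at step t
        h_tilde = self.gru(x_t, h_prev)  # standard recurrent candidate update
        p_t = torch.sigmoid(self.gate(torch.cat([x_t, h_prev, m_t], dim=-1)))
        # Blend the candidate state and the previous state under the gate.
        return p_t * h_tilde + (1.0 - p_t) * h_prev


if __name__ == "__main__":
    cell = PositionGatedRNNCell(input_size=100, hidden_size=128)
    x = torch.randn(4, 100)   # batch of 4 answer-word embeddings
    h = torch.zeros(4, 128)   # initial hidden state
    m = torch.rand(4, 1)      # toy positional-influence scores
    h = cell(x, h, m)
    print(h.shape)            # torch.Size([4, 128])
```

The design point this sketch captures is that positional context enters the recurrence itself, at every step, rather than being applied as attention weights after all hidden states are computed.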

Citation (APA)

Song, Y., Hu, W., Chen, Q., Hu, Q., & He, L. (2018). Enhancing the recurrent neural networks with positional gates for sentence representation. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11301 LNCS, pp. 511–521). Springer Verlag. https://doi.org/10.1007/978-3-030-04167-0_46
