Attention-based memory network for sentence-level question answering

Abstract

Sentence-level question answering (QA) for news articles is a promising task for social media: given a news article and a corresponding question, a machine must understand the article and answer the question by selecting an answer sentence from it. Recently, several deep neural networks have been proposed for sentence-level QA. To the best of our knowledge, none of them explicitly uses keywords that appear simultaneously in questions and documents. In this paper we introduce the Attention-based Memory Network (Att-MemNN), a new iterative bi-directional attention memory network that predicts answer sentences. It exploits the co-occurrence of keywords between questions and documents as augmented inputs to the deep neural network, and it embeds documents and their corresponding questions in different ways, processing questions with both word-level and contextual-level embeddings while processing documents with word-level embeddings only. Experimental results on the NewsQA test set show that our model yields substantial improvement. We also present quantitative and qualitative analyses to illustrate the results.
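The keyword co-occurrence feature mentioned in the abstract can be illustrated with a minimal sketch: for each token in a candidate answer sentence, emit a binary flag indicating whether that token (ignoring case and stopwords) also appears in the question. The function name, stopword list, and whitespace tokenization below are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a question-document keyword co-occurrence feature.
# Tokenization and the stopword list are simplifying assumptions.

STOPWORDS = {"the", "a", "an", "is", "of", "to", "in", "and", "what", "who", "on"}

def cooccurrence_features(question, sentence):
    """Return one 0/1 flag per sentence token: 1 if the lowercased token
    is a non-stopword that also occurs in the question."""
    q_keywords = {w.lower() for w in question.split()
                  if w.lower() not in STOPWORDS}
    return [1 if w.lower() in q_keywords else 0 for w in sentence.split()]

flags = cooccurrence_features(
    "Who won the election", "Smith won the local election on Tuesday")
# flags marks only "won" and "election" as co-occurring keywords
```

In the paper's setting, such flags would be concatenated to the word-level embeddings of the document as an additional input signal rather than used on their own.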

Citation (APA)

Liu, P., Zhang, C., Zhang, W., Zhan, Z., & Zhuang, B. (2017). Attention-based memory network for sentence-level question answering. In Communications in Computer and Information Science (Vol. 774, pp. 104–115). Springer Verlag. https://doi.org/10.1007/978-981-10-6805-8_9
