An improved end-to-end memory network for QA tasks


Abstract

At present, the End-to-End trainable Memory Network (MemN2N) has proven promising in many deep learning fields, especially on simple natural-language reasoning question-and-answer (QA) tasks. However, it still struggles with subtasks such as basic induction, path finding, and time reasoning, because of its limited ability to learn useful relations between memory and query. In this paper, motivated by the success of attention mechanisms in neural machine translation, we propose a novel end-to-end memory network based on gated linear units (GLU) and local attention (MemN2N-GL). The model shows an improved ability to capture complex memory-query relations and performs better on these subtasks, making it an improved end-to-end memory network for QA tasks. We demonstrate the effectiveness of these approaches on the bAbI dataset, which includes 20 challenging tasks, without the use of any domain knowledge. Our project is open source on GitHub.
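The abstract names gated linear units (GLU) as one of the two additions to MemN2N. The paper's exact placement of the gate in the memory network is not given here, so the sketch below only illustrates the standard GLU formulation, (xW + b) ⊙ σ(xV + c), applied to a toy input; the weight names `W`, `V`, `b`, `c` and the dimensions are illustrative assumptions, not the authors' configuration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def glu(x, W, b, V, c):
    # Gated linear unit: a linear projection of the input, modulated
    # elementwise by a sigmoid gate computed from the same input.
    return (x @ W + b) * sigmoid(x @ V + c)

# Toy example: project a 4-d "memory" vector to 3 gated features.
rng = np.random.default_rng(0)
x = rng.standard_normal(4)
W = rng.standard_normal((4, 3))
V = rng.standard_normal((4, 3))
b = np.zeros(3)
c = np.zeros(3)
y = glu(x, W, b, V, c)
print(y.shape)  # (3,)
```

Because the gate values lie strictly in (0, 1), each output feature is an attenuated copy of the corresponding linear activation, which is what lets the gate control how much memory-query information flows through.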




Citation (APA)

Wulamu, A., Sun, Z., Xie, Y., Xu, C., & Yang, A. (2019). An improved end-to-end memory network for QA tasks. Computers, Materials and Continua, 60(3), 1283–1295. https://doi.org/10.32604/cmc.2019.07722

