Mem2Seq: Effectively incorporating knowledge bases into end-to-end task-oriented dialog systems

197 Citations
448 Readers (Mendeley users who have this article in their library)

Abstract

End-to-end task-oriented dialog systems usually suffer from the challenge of incorporating knowledge bases. In this paper, we propose a novel yet simple end-to-end differentiable model called memory-to-sequence (Mem2Seq) to address this issue. Mem2Seq is the first neural generative model that combines multi-hop attention over memories with the idea of a pointer network. We empirically show how Mem2Seq controls each generation step, and how its multi-hop attention mechanism helps learn correlations between memories. In addition, our model is quite general and requires no complicated task-specific design. As a result, Mem2Seq can be trained faster and attains state-of-the-art performance on three different task-oriented dialog datasets.
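
The two core ideas named in the abstract, multi-hop attention over a memory and a pointer-network-style copy distribution, can be illustrated with a minimal sketch. This is not the authors' implementation: the class name MultiHopMemory, the hop count, and the adjacent weight-tying scheme are assumptions made for illustration, and the learned gate that chooses between pointing and generating in the real model is omitted.

```python
# Minimal sketch (assumed names, not the authors' code) of multi-hop memory
# attention plus a pointer-vs-generate pair of output distributions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiHopMemory(nn.Module):
    """End-to-end memory-network-style encoder: K hops of attention over memory tokens."""
    def __init__(self, vocab_size, embed_dim, hops=3):
        super().__init__()
        self.hops = hops
        # One embedding matrix per hop; adjacent matrices act as keys/values (a common variant).
        self.embeddings = nn.ModuleList(
            nn.Embedding(vocab_size, embed_dim) for _ in range(hops + 1)
        )

    def forward(self, memory_tokens, query):
        # memory_tokens: (batch, mem_len) token ids; query: (batch, embed_dim)
        q = query
        attn = None
        for k in range(self.hops):
            keys = self.embeddings[k](memory_tokens)        # (batch, mem_len, dim)
            values = self.embeddings[k + 1](memory_tokens)  # (batch, mem_len, dim)
            scores = torch.bmm(keys, q.unsqueeze(2)).squeeze(2)  # (batch, mem_len)
            attn = F.softmax(scores, dim=1)
            o = torch.bmm(attn.unsqueeze(1), values).squeeze(1)  # (batch, dim)
            q = q + o  # refine the query with what each hop retrieves
        # The last hop's attention can double as a pointer distribution over memory positions.
        return q, attn

# Toy usage: the decoder can either generate from the vocabulary or "point" to a
# memory position (e.g., a KB entity); here we simply compute both distributions.
vocab_size, dim = 100, 32
mem = torch.randint(0, vocab_size, (2, 10))   # batch of 2 memories, 10 tokens each
query = torch.randn(2, dim)                   # e.g., a decoder hidden state
encoder = MultiHopMemory(vocab_size, dim, hops=3)
generate_proj = nn.Linear(dim, vocab_size)

context, pointer_dist = encoder(mem, query)
vocab_dist = F.softmax(generate_proj(context), dim=1)  # generate from vocabulary
print(pointer_dist.shape, vocab_dist.shape)            # (2, 10) and (2, 100)
```

Each hop refines the query with the retrieved memory summary, and the final hop's attention weights serve as a copy distribution over memory positions, which is how a pointer mechanism lets a decoder emit knowledge-base entities directly rather than generating them from a fixed vocabulary.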

Citation (APA)

Madotto, A., Wu, C. S., & Fung, P. (2018). Mem2Seq: Effectively incorporating knowledge bases into end-to-end task-oriented dialog systems. In ACL 2018 - 56th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Long Papers) (Vol. 1, pp. 1468–1478). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/p18-1136
