Memory Attention Neural Network for Multi-domain Dialogue State Tracking

Abstract

In a task-oriented dialogue system, the dialogue state tracker aims to generate a structured summary (domain-slot-value triples) over the whole dialogue. However, existing approaches generally fail to make good use of pre-defined ontologies. In this paper, we propose a novel Memory Attention State Tracker that treats the ontology as prior knowledge and uses a Memory Network to store this information. Our model is composed of an utterance encoder, an attention-based query generator, a slot gate classifier, and an ontology Memory Network for every domain-slot pair. To make a fair comparison with previous approaches, we also conduct experiments with an RNN encoder instead of pre-trained BERT. Empirical results show that our model achieves competitive joint accuracy on the MultiWOZ 2.0 and MultiWOZ 2.1 datasets.
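The core readout the abstract describes — attending from an utterance-derived query over the ontology values stored in a per-slot memory — can be sketched in a few lines. The sketch below is a minimal illustration, not the paper's implementation: the function name `memory_attention_readout`, the toy two-dimensional embeddings, and the example `hotel-price_range` slot are all hypothetical, and a real model would learn these vectors with an encoder and train the scores end to end.

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of raw scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def dot(a, b):
    """Dot-product similarity between a query and a memory entry."""
    return sum(x * y for x, y in zip(a, b))

def memory_attention_readout(query, memory):
    """Attend over ontology value embeddings for one domain-slot pair.

    query:  query vector (in the paper, produced by the query generator
            from the encoded utterance)
    memory: dict mapping each candidate ontology value to its embedding
    Returns a probability distribution over the candidate values.
    """
    values = list(memory.keys())
    scores = [dot(query, memory[v]) for v in values]
    probs = softmax(scores)
    return dict(zip(values, probs))

# Toy example: a hypothetical "hotel-price_range" slot with three values.
memory = {
    "cheap":     [1.0, 0.0],
    "moderate":  [0.0, 1.0],
    "expensive": [-1.0, 0.0],
}
query = [0.9, 0.1]  # stand-in for an encoder-derived query vector
dist = memory_attention_readout(query, memory)
predicted = max(dist, key=dist.get)  # "cheap" for this toy query
```

In the full model, a slot gate classifier would first decide whether the slot is mentioned at all (e.g. none / don't-care / active), and only active slots would run this attention readout to pick a value.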

Citation (APA)

Xu, Z., Chen, Z., Chen, L., Zhu, S., & Yu, K. (2020). Memory Attention Neural Network for Multi-domain Dialogue State Tracking. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 12430 LNAI, pp. 41–52). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-60450-9_4
