Recent Neural Methods on Dialogue State Tracking for Task-Oriented Dialogue Systems: A Survey


Abstract

This paper provides a comprehensive overview of recent developments in dialogue state tracking (DST) for task-oriented conversational systems. We introduce the task, the main datasets used to benchmark it and their evaluation metrics, and we analyze several proposed approaches. We distinguish between static ontology DST models, which predict values from a fixed set of dialogue states, and dynamic ontology models, which can predict dialogue states even when the ontology changes. We also discuss models' ability to track single or multiple domains and to scale to new domains, in terms of both knowledge transfer and zero-shot learning. We cover the period from 2013 to 2020, showing a significant increase in multi-domain methods, most of which rely on pre-trained language models.
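To make the task concrete, the following minimal sketch (not from the paper) assumes a MultiWOZ-style representation of the dialogue state as (domain, slot, value) triples that a tracker updates after every user turn; the comments note where the static-ontology versus dynamic-ontology distinction enters. All names here are illustrative, not the authors' code.

from typing import Dict, Tuple

# (domain, slot) -> value, e.g. ("restaurant", "pricerange") -> "cheap"
DialogueState = Dict[Tuple[str, str], str]

def update_state(state: DialogueState,
                 turn_predictions: Dict[Tuple[str, str], str]) -> DialogueState:
    """Merge the slot-value predictions for the current turn into the running state.

    A static-ontology tracker would choose each value from a fixed candidate list
    defined in the ontology, while a dynamic-ontology (open-vocabulary) tracker
    can generate or copy values directly from the dialogue context.
    """
    new_state = dict(state)
    for (domain, slot), value in turn_predictions.items():
        if value == "none":  # slot not mentioned or reset in this turn
            new_state.pop((domain, slot), None)
        else:
            new_state[(domain, slot)] = value
    return new_state

if __name__ == "__main__":
    state: DialogueState = {}
    # Turn 1: "I need a cheap restaurant in the centre."
    state = update_state(state, {("restaurant", "pricerange"): "cheap",
                                 ("restaurant", "area"): "centre"})
    # Turn 2: "Make it moderately priced, and I also need a taxi to the north."
    state = update_state(state, {("restaurant", "pricerange"): "moderate",
                                 ("taxi", "destination"): "north"})
    print(state)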

Citation (APA)

Balaraman, V., Sheikhalishahi, S., & Magnini, B. (2021). Recent Neural Methods on Dialogue State Tracking for Task-Oriented Dialogue Systems: A Survey. In SIGDIAL 2021 - 22nd Annual Meeting of the Special Interest Group on Discourse and Dialogue, Proceedings of the Conference (pp. 239–251). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.sigdial-1.25
