Encoding word confusion networks with recurrent neural networks for dialog state tracking

6 citations · 80 Mendeley readers

Abstract

This paper presents our novel method for encoding word confusion networks, which can represent a rich hypothesis space of automatic speech recognition systems, via recurrent neural networks. We demonstrate the utility of our approach for dialog state tracking in spoken dialog systems, a task that relies on automatic speech recognition output. Encoding confusion networks outperforms encoding the single best hypothesis of the automatic speech recognizer in a neural system for dialog state tracking on the well-known second Dialog State Tracking Challenge dataset.
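The paper itself does not publish reference code here, but one common way to feed a word confusion network to an RNN, consistent with the abstract's description, is to collapse each slot of competing words into the posterior-weighted sum of their embeddings and process the resulting vector sequence with a recurrent cell. The sketch below illustrates this idea with a toy vocabulary, random embeddings, and a plain Elman RNN; all names, dimensions, and the example network are illustrative assumptions, not the authors' actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy vocabulary and randomly initialised embeddings
# (illustrative only; not from the paper).
vocab = {"<eps>": 0, "cheap": 1, "chip": 2, "restaurant": 3}
emb_dim, hid_dim = 4, 5
E = rng.standard_normal((len(vocab), emb_dim))

# A word confusion network: one slot per time step; each slot lists
# (word, posterior) alternatives whose posteriors sum to 1.
cn = [
    [("cheap", 0.6), ("chip", 0.4)],
    [("restaurant", 0.9), ("<eps>", 0.1)],
]

# Elman RNN parameters (random, for the sketch).
Wx = rng.standard_normal((hid_dim, emb_dim))
Wh = rng.standard_normal((hid_dim, hid_dim))
b = np.zeros(hid_dim)

def encode_cn(cn):
    """Encode a confusion network: each slot becomes the posterior-weighted
    sum of its alternatives' embeddings, then that vector is fed to the RNN."""
    h = np.zeros(hid_dim)
    for slot in cn:
        x = sum(p * E[vocab[w]] for w, p in slot)  # weighted embedding sum
        h = np.tanh(Wx @ x + Wh @ h + b)           # one RNN step
    return h

h = encode_cn(cn)
print(h.shape)  # final hidden state summarising the whole network
```

The final hidden state `h` can then serve as the utterance representation for a downstream dialog state tracker, in place of an encoding of only the single best ASR hypothesis.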

Citation (APA)

Jagfeld, G., & Vu, N. T. (2017). Encoding word confusion networks with recurrent neural networks for dialog state tracking. In EMNLP 2017 - 1st Workshop on Speech-Centric Natural Language Processing, SCNLP 2017 - Proceedings of the Workshop (pp. 10–17). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/w17-4602
