A Co-Attentive Cross-Lingual Neural Model for Dialogue Breakdown Detection


Abstract

Ensuring smooth communication is essential in a chat-oriented dialogue system, so that a user can obtain meaningful responses through interactions with the system. Most prior work on dialogue research does not focus on preventing dialogue breakdown. One of the major challenges is that a dialogue system may generate an undesired utterance leading to a dialogue breakdown, which degrades the overall interaction quality. Hence, it is crucial for a machine to detect dialogue breakdowns in an ongoing conversation. In this paper, we propose a novel dialogue breakdown detection model that jointly incorporates a pretrained cross-lingual language model and a co-attention network. Our proposed model leverages effective word embeddings trained on one hundred different languages to generate contextualized representations. Co-attention aims to capture the interaction between the latest utterance and the conversation history, and thereby determines whether the latest utterance causes a dialogue breakdown. Experimental results show that our proposed model outperforms all previous approaches on all evaluation metrics in both the Japanese and English tracks of Dialogue Breakdown Detection Challenge 4 (DBDC4 at IWSDS 2019).
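The abstract does not give the model's equations, but the core idea of attending between the latest utterance and the conversation history can be illustrated with a standard affinity-matrix co-attention sketch. The function name, the scaled dot-product affinity, and the toy token dimensions below are assumptions for illustration, not the paper's actual formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def co_attention(utterance, history):
    """Illustrative affinity-based co-attention (an assumption, not the
    paper's exact equations) between the latest utterance, shape (m, d),
    and the conversation history, shape (n, d)."""
    d = utterance.shape[-1]
    # Affinity matrix: similarity of every utterance token to every history token.
    affinity = utterance @ history.T / np.sqrt(d)     # (m, n)
    # Utterance-to-history: each utterance token attends over the history.
    u2h = softmax(affinity, axis=1) @ history         # (m, d)
    # History-to-utterance: each history token attends over the utterance.
    h2u = softmax(affinity, axis=0).T @ utterance     # (n, d)
    return u2h, h2u

rng = np.random.default_rng(0)
U = rng.normal(size=(4, 8))   # 4 tokens in the latest utterance, dim 8
H = rng.normal(size=(6, 8))   # 6 tokens of conversation history
u2h, h2u = co_attention(U, H)
print(u2h.shape, h2u.shape)   # (4, 8) (6, 8)
```

In a full model, the two attended representations would be combined with the contextualized embeddings from the pretrained cross-lingual encoder and fed to a classifier over the breakdown labels; that wiring is omitted here.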

Citation (APA)

Lin, Q., Kundu, S., & Ng, H. T. (2020). A Co-Attentive Cross-Lingual Neural Model for Dialogue Breakdown Detection. In COLING 2020 - 28th International Conference on Computational Linguistics, Proceedings of the Conference (pp. 4201–4210). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.coling-main.371
