Optimizing NLU Reranking Using Entity Resolution Signals in Multi-domain Dialog Systems

2 citations · 50 Mendeley readers

Abstract

In dialog systems, the Natural Language Understanding (NLU) component typically makes the interpretation decision (including domain, intent, and slots) for an utterance before the mentioned entities are resolved, which can lead to intent classification and slot tagging errors. In this work, we propose leveraging Entity Resolution (ER) features in NLU reranking and introduce a novel loss term based on ER signals to better learn model weights in the reranking framework. In addition, for multi-domain dialog scenarios, we propose a score distribution matching method to ensure that scores generated by the NLU reranking models for different domains are properly calibrated. In offline experiments, we demonstrate that our proposed approach significantly outperforms the baseline model on both single-domain and cross-domain evaluations.
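The abstract does not spell out how score distribution matching is implemented. As one illustrative reading only, per-domain normalization can put reranker scores from different domain models on a comparable scale before they compete cross-domain. A minimal sketch, assuming z-normalization with per-domain statistics (the function name and the normalization choice are illustrative, not the paper's actual method):

```python
import statistics

def calibrate_scores(scores_by_domain):
    """Map each domain's reranker scores onto a shared scale by
    z-normalizing with that domain's own mean and standard deviation.
    This is a stand-in illustration of cross-domain score calibration,
    not the method proposed in the paper."""
    calibrated = {}
    for domain, scores in scores_by_domain.items():
        mean = statistics.mean(scores)
        # Guard against zero spread (all hypotheses scored identically).
        std = statistics.pstdev(scores) or 1.0
        calibrated[domain] = [(s - mean) / std for s in scores]
    return calibrated

# Example: one domain's model scores cluster high, another's low;
# after calibration the two score lists are directly comparable.
raw = {"Music": [0.9, 0.8, 0.7], "Weather": [0.3, 0.2, 0.1]}
print(calibrate_scores(raw))
```

Note that this simple per-domain standardization preserves the ranking within each domain while removing the offset between domains, which is the property cross-domain reranking needs.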

Citation (APA)

Wang, T., Chen, J., Malmir, M., Dong, S., He, X., Wang, H., … Liu, Y. (2021). Optimizing NLU Reranking Using Entity Resolution Signals in Multi-domain Dialog Systems. In NAACL-HLT 2021 - 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Industry Papers (pp. 19–25). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.naacl-industry.3
