Disentangled Knowledge Transfer for OOD Intent Discovery with Unified Contrastive Learning

Abstract

Discovering Out-of-Domain (OOD) intents is essential for developing new skills in a task-oriented dialogue system. The key challenge is how to transfer prior in-domain (IND) knowledge to OOD clustering. Unlike existing work based on a shared intent representation, we propose a novel disentangled knowledge transfer method via a unified multi-head contrastive learning framework, which aims to bridge the gap between IND pre-training and OOD clustering. Experiments and analysis on two benchmark datasets demonstrate the effectiveness of our method.
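
To make the general idea concrete, the sketch below shows a two-head contrastive setup in this spirit: a shared encoder feeds an IND projection head trained with a supervised contrastive loss on labelled intents, and a separate OOD projection head trained with an instance-level contrastive loss on augmented pairs. This is a minimal, hypothetical illustration, not the authors' implementation; the class and function names (TwoHeadEncoder, sup_con_loss, instance_con_loss), the linear stand-in for a BERT-style encoder, and the random toy features are all assumptions made for the example.

```python
# Hypothetical sketch of a two-head (multi-head) contrastive objective.
# Assumption: a shared encoder with one supervised (IND) head and one
# unsupervised (OOD) head; this is illustrative, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TwoHeadEncoder(nn.Module):
    def __init__(self, backbone_dim=768, proj_dim=128):
        super().__init__()
        # Stand-in for a BERT-style sentence encoder.
        self.backbone = nn.Sequential(nn.Linear(backbone_dim, backbone_dim), nn.ReLU())
        self.ind_head = nn.Linear(backbone_dim, proj_dim)  # supervised (IND) head
        self.ood_head = nn.Linear(backbone_dim, proj_dim)  # unsupervised (OOD) head

    def forward(self, x):
        h = self.backbone(x)
        return F.normalize(self.ind_head(h), dim=-1), F.normalize(self.ood_head(h), dim=-1)

def sup_con_loss(z, labels, temperature=0.1):
    """Supervised contrastive loss: samples sharing an intent label are positives."""
    sim = z @ z.t() / temperature
    self_mask = torch.eye(len(z), dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(self_mask, float('-inf'))           # exclude self-pairs
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    log_prob = log_prob.masked_fill(self_mask, 0.0)            # avoid -inf * 0 = nan
    pos = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    return -(log_prob * pos).sum(1).div(pos.sum(1).clamp(min=1)).mean()

def instance_con_loss(z1, z2, temperature=0.1):
    """Instance-level contrastive loss: each sample's augmented view is its positive."""
    z = torch.cat([z1, z2], dim=0)
    sim = z @ z.t() / temperature
    n = len(z1)
    self_mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(self_mask, float('-inf'))
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)

# Usage sketch with random features standing in for sentence embeddings.
model = TwoHeadEncoder()
ind_x, ind_y = torch.randn(16, 768), torch.randint(0, 4, (16,))   # labelled IND batch
ood_v1, ood_v2 = torch.randn(16, 768), torch.randn(16, 768)       # two views of OOD batch
z_ind, _ = model(ind_x)
_, z1 = model(ood_v1)
_, z2 = model(ood_v2)
loss = sup_con_loss(z_ind, ind_y) + instance_con_loss(z1, z2)
loss.backward()
```

Keeping the two objectives on separate projection heads, rather than a single shared representation, is one plausible way to read "disentangled" transfer: the IND head can specialize in label-discriminative structure while the OOD head learns cluster-friendly instance structure from the same backbone.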

Citation (APA)

Mou, Y., He, K., Wu, Y., Zeng, Z., Xu, H., Jiang, H., … Xu, W. (2022). Disentangled Knowledge Transfer for OOD Intent Discovery with Unified Contrastive Learning. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 2, pp. 46–53). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.acl-short.6
