Discovering Out-of-Domain (OOD) intents is essential for developing new skills in a task-oriented dialogue system. The key challenge is how to transfer prior in-domain (IND) knowledge to OOD clustering. Different from existing work based on a shared intent representation, we propose a novel disentangled knowledge transfer method via a unified multi-head contrastive learning framework, aiming to bridge the gap between IND pre-training and OOD clustering. Experiments and analysis on two benchmark datasets show the effectiveness of our method.
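To make the "multi-head contrastive learning" idea more concrete, the sketch below is a minimal, hypothetical PyTorch illustration (not the authors' released implementation): a shared encoder feeds two projection heads, and the IND head is trained with a SupCon-style supervised contrastive loss on labeled in-domain data while the OOD head is reserved for unsupervised clustering objectives. All class names, dimensions, and the toy encoder are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiHeadContrastiveModel(nn.Module):
    """Shared encoder with two projection heads (illustrative sketch only)."""
    def __init__(self, hidden_dim=768, proj_dim=128):
        super().__init__()
        # Placeholder encoder; in practice this would be a BERT-style utterance encoder.
        self.encoder = nn.Sequential(nn.Linear(hidden_dim, hidden_dim), nn.ReLU())
        self.ind_head = nn.Linear(hidden_dim, proj_dim)  # head for supervised IND contrastive learning
        self.ood_head = nn.Linear(hidden_dim, proj_dim)  # head for unsupervised OOD clustering objectives

    def forward(self, x):
        h = self.encoder(x)
        return F.normalize(self.ind_head(h), dim=-1), F.normalize(self.ood_head(h), dim=-1)

def supervised_contrastive_loss(z, labels, temperature=0.07):
    """SupCon-style loss: pull together normalized embeddings that share a label."""
    sim = z @ z.t() / temperature
    # Exclude self-similarity on the diagonal from both numerator and denominator.
    self_mask = torch.eye(len(z), dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(self_mask, float('-inf'))
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)).float() * (~self_mask).float()
    # Average log-probability over positive pairs; clamp avoids division by zero
    # for anchors that happen to have no positives in the batch.
    loss = -(log_prob.masked_fill(self_mask, 0.0) * pos_mask).sum(1) / pos_mask.sum(1).clamp(min=1.0)
    return loss.mean()

# Toy usage: random features standing in for encoded IND utterances.
model = MultiHeadContrastiveModel()
feats = torch.randn(8, 768)
labels = torch.randint(0, 3, (8,))
z_ind, z_ood = model(feats)
loss = supervised_contrastive_loss(z_ind, labels)
loss.backward()
```

The two heads share the encoder but keep their objectives separate, which is one common way to let IND supervision shape the representation without forcing OOD clustering through the same projection.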
Citation
Mou, Y., He, K., Wu, Y., Zeng, Z., Xu, H., Jiang, H., … Xu, W. (2022). Disentangled Knowledge Transfer for OOD Intent Discovery with Unified Contrastive Learning. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 2, pp. 46–53). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.acl-short.6