Lexical Knowledge Internalization for Neural Dialog Generation

Abstract

We propose knowledge internalization (KI), which aims to complement neural dialog models with lexical knowledge. Instead of further conditioning knowledge-grounded dialog (KGD) models on externally retrieved knowledge, we seek to integrate knowledge about each input token internally into the model's parameters. To tackle the challenge posed by the large scale of lexical knowledge, we adopt a contrastive learning approach and create an effective token-level lexical knowledge retriever that requires only weak supervision mined from Wikipedia. We demonstrate the effectiveness and general applicability of our approach on various datasets and diverse model structures.
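To make the abstract's idea concrete, below is a minimal sketch of the kind of token-level contrastive objective it describes: each input token's hidden state is pulled toward the embedding of its weakly aligned knowledge sentence and pushed away from in-batch negatives (InfoNCE). This is not the authors' released code; the function name, the 0.07 temperature, and the 0.5 loss weight in the usage note are illustrative assumptions.

```python
# A hypothetical sketch of a token-level contrastive (InfoNCE) loss,
# assuming token and knowledge representations have already been
# produced by the dialog model and a knowledge encoder, respectively.
import torch
import torch.nn.functional as F

def token_knowledge_contrastive_loss(
    token_reprs: torch.Tensor,      # (N, d) hidden states of tokens that have aligned knowledge
    knowledge_reprs: torch.Tensor,  # (N, d) embeddings of the aligned knowledge sentences
    temperature: float = 0.07,      # assumed value, not from the paper
) -> torch.Tensor:
    """The i-th knowledge sentence is the positive for the i-th token;
    the other in-batch knowledge sentences act as negatives."""
    t = F.normalize(token_reprs, dim=-1)
    k = F.normalize(knowledge_reprs, dim=-1)
    logits = t @ k.t() / temperature                      # (N, N) similarity matrix
    targets = torch.arange(t.size(0), device=t.device)    # diagonal = positives
    return F.cross_entropy(logits, targets)

# Usage: add the contrastive term to the dialog model's generation loss,
# so the knowledge is internalized into the parameters during training.
# lm_loss = ...                                   # standard NLL of the response
# ki_loss = token_knowledge_contrastive_loss(tok_h, know_h)
# loss = lm_loss + 0.5 * ki_loss                  # 0.5 is an assumed weight
```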

Cite

APA

Wu, Z., Bi, W., Li, X., Kong, L., & Kao, B. (2022). Lexical Knowledge Internalization for Neural Dialog Generation. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 1, pp. 7945–7958). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.acl-long.547
