A Neural Conversation Generation Model via Equivalent Shared Memory Investigation


Abstract

Conversation generation, a challenging task in Natural Language Generation (NLG), has attracted increasing attention in recent years. A number of recent works adopted sequence-to-sequence structures along with external knowledge, which successfully enhanced the quality of generated conversations. Nevertheless, few works have utilized knowledge extracted from similar conversations for utterance generation. Taking conversations in the customer service and court debate domains as examples, it is evident that essential entities/phrases, as well as their associated logic and inter-relationships, can be extracted and borrowed from similar conversation instances. Such information can provide useful signals for improving conversation generation. In this paper, we propose a novel reading and memory framework called Deep Reading Memory Network (DRMN), which is capable of remembering useful information from similar conversations to improve utterance generation. We apply our model to two large-scale conversation datasets from the justice and e-commerce domains. Experiments show that the proposed model outperforms state-of-the-art approaches.
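The abstract describes a two-step idea: retrieve a similar past conversation, then use it as an external memory when generating the next utterance. A minimal sketch of that idea (not the authors' DRMN; all function names, the cosine-similarity retrieval, and the softmax attention are illustrative assumptions) might look like:

```python
# Illustrative sketch only -- NOT the published DRMN architecture.
# Step 1: retrieve the stored conversation most similar to the current one.
# Step 2: soft-attend over its utterance vectors to build a memory summary
#         that could condition an utterance generator.
import numpy as np

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

def retrieve_similar(query_vec, corpus_vecs):
    """Index of the stored conversation whose summary vector is closest."""
    sims = [cosine(query_vec, c) for c in corpus_vecs]
    return int(np.argmax(sims))

def attend(query_vec, memory):
    """Softmax attention over memory rows; returns a weighted summary vector."""
    scores = memory @ query_vec              # one score per memory slot
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                 # softmax normalization
    return weights @ memory                  # weighted combination of slots

rng = np.random.default_rng(0)
corpus = [rng.normal(size=(5, 8)) for _ in range(3)]  # 3 stored conversations,
                                                      # 5 utterances x 8-dim each
query = rng.normal(size=8)                            # current dialogue state

idx = retrieve_similar(query, [c.mean(axis=0) for c in corpus])
summary = attend(query, corpus[idx])  # memory-conditioned context vector
```

In a full model the `summary` vector would be fed into the decoder alongside the dialogue context, so that entities and phrasing from the retrieved conversation can influence the generated utterance.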

Citation (APA)

Ji, C., Zhang, Y., Liu, X., Jatowt, A., Sun, C., Zhu, C., & Zhao, T. (2021). A Neural Conversation Generation Model via Equivalent Shared Memory Investigation. In International Conference on Information and Knowledge Management, Proceedings (pp. 771–781). Association for Computing Machinery. https://doi.org/10.1145/3459637.3482407
