Retrieval-Augmented Transformer-XL for Close-Domain Dialog Generation

Abstract

Transformer-based models have demonstrated excellent capabilities for capturing patterns and structures in natural language generation and have achieved state-of-the-art results on many tasks. In this paper we present a transformer-based model for multi-turn dialog response generation. Our solution is a hybrid approach that augments a transformer-based generative model with a novel retrieval mechanism, which leverages information memorized in the training data via k-Nearest Neighbor search. Our system is evaluated on two datasets of customer/assistant dialogs: Taskmaster-1, released by Google and containing high-quality, goal-oriented conversational data, and a proprietary dataset collected from a real customer-service call center. On both, our model achieves better BLEU scores than strong baselines.
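The retrieval mechanism described above can be illustrated with a minimal sketch in the style of kNN-augmented language models: hidden states of training contexts are stored as keys in a datastore alongside the tokens that followed them; at generation time, the k nearest keys to the current context vector induce a retrieval distribution that is interpolated with the generator's own next-token distribution. All names, the distance-based weighting, and the interpolation coefficient `lam` here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def knn_augmented_probs(context_vec, model_probs, datastore_keys,
                        datastore_next_tokens, k=4, lam=0.3, temperature=1.0):
    """Interpolate a generator's next-token distribution with a kNN
    retrieval distribution built from memorized training states.

    context_vec          : (d,) hidden state of the current dialog context
    model_probs          : (V,) next-token probabilities from the generator
    datastore_keys       : (N, d) hidden states saved from training contexts
    datastore_next_tokens: (N,) token id that followed each saved context
    (illustrative sketch; not the authors' exact method)
    """
    # Euclidean distance from the query to every stored key
    dists = np.linalg.norm(datastore_keys - context_vec, axis=1)
    nn = np.argsort(dists)[:k]  # indices of the k nearest neighbors

    # Softmax over negative distances -> neighbor weights
    w = np.exp(-dists[nn] / temperature)
    w /= w.sum()

    # Scatter neighbor weights onto their recorded next tokens
    knn_probs = np.zeros_like(model_probs)
    for weight, tok in zip(w, datastore_next_tokens[nn]):
        knn_probs[tok] += weight

    # Linear interpolation of the two distributions
    return lam * knn_probs + (1.0 - lam) * model_probs
```

At each decoding step the combined distribution can then be sampled or arg-maxed as usual; a larger `lam` shifts generation toward responses memorized in the training dialogs.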

Citation (APA):
Bonetta, G., Cancelliere, R., Liu, D., & Vozila, P. (2021). Retrieval-Augmented Transformer-XL for Close-Domain Dialog Generation. In Proceedings of the International Florida Artificial Intelligence Research Society Conference, FLAIRS (Vol. 34). Florida Online Journals, University of Florida. https://doi.org/10.32473/flairs.v34i1.128369
