A Simple and Efficient Multi-Task Learning Approach for Conditioned Dialogue Generation


Abstract

Conditioned dialogue generation suffers from the scarcity of labeled responses. In this work, we exploit labeled non-dialogue text data related to the condition, which are much easier to collect. We propose a multi-task learning approach to leverage both labeled dialogue and text data. Three tasks jointly optimize the same pre-trained Transformer: a conditioned dialogue generation task on the labeled dialogue data, and a conditioned language encoding task and a conditioned language generation task on the labeled text data. Experimental results show that our approach outperforms state-of-the-art models by leveraging the labeled texts, and that it obtains larger performance improvements than previous methods for leveraging text data.
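The core idea of the abstract — three task losses jointly updating one shared pre-trained Transformer — can be illustrated with a minimal multi-task gradient-descent sketch. This is not the authors' implementation: the toy linear model, the `multitask_step` function, and the example batches are all illustrative stand-ins for the shared Transformer and the three conditioned tasks.

```python
# Hedged sketch: three task losses update one shared parameter vector,
# mirroring the paper's idea of jointly optimizing a single shared model.
# The model is a toy linear scorer with squared-error losses; all names
# and data are hypothetical.

def loss_and_grad(w, batch, target):
    # Squared-error stand-in for one task's loss on shared weights w.
    pred = sum(wi * xi for wi, xi in zip(w, batch))
    err = pred - target
    return err * err, [2 * err * xi for xi in batch]

def multitask_step(w, tasks, lr=0.01):
    # Accumulate gradients from every task before one shared update,
    # i.e. the losses are summed and the shared parameters move once.
    grads = [0.0] * len(w)
    total = 0.0
    for batch, target in tasks:
        loss, g = loss_and_grad(w, batch, target)
        total += loss
        grads = [a + b for a, b in zip(grads, g)]
    return [wi - lr * gi for wi, gi in zip(w, grads)], total

# Three stand-in "tasks": dialogue generation, conditioned language
# encoding, and conditioned language generation.
tasks = [([1.0, 0.0], 2.0), ([0.0, 1.0], -1.0), ([1.0, 1.0], 1.0)]
w = [0.0, 0.0]
for _ in range(200):
    w, total_loss = multitask_step(w, tasks)
```

Because the three quadratic losses here share a consistent minimizer, the joint updates drive the summed loss toward zero, which is the intuition behind letting auxiliary text-data tasks shape the same shared parameters.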


Citation (APA)

Zeng, Y., & Nie, J. Y. (2021). A Simple and Efficient Multi-Task Learning Approach for Conditioned Dialogue Generation. In NAACL-HLT 2021 - 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Proceedings of the Conference (pp. 4927–4939). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.naacl-main.392
