NAST: A Non-Autoregressive Generator with Word Alignment for Unsupervised Text Style Transfer


Abstract

Autoregressive models have been widely used in unsupervised text style transfer. Despite their success, these models still suffer from a content preservation problem: they often ignore part of the source sentence and generate irrelevant words with strong style markers. In this paper, we propose a Non-Autoregressive generator for unsupervised text Style Transfer (NAST), which alleviates the problem from two aspects. First, we observe that most words in the transferred sentence can be aligned with related words in the source sentence, so we explicitly model word alignments to suppress irrelevant words. Second, existing models trained with the cycle loss align sentences in two stylistic text spaces but lack fine-grained control at the word level; the proposed non-autoregressive generator focuses on the connections between aligned words and thus learns word-level transfer between styles. In experiments, we integrate the proposed generator into two base models and evaluate them on two style transfer tasks. The results show that NAST significantly improves overall performance and provides explainable word alignments. Moreover, the non-autoregressive generator achieves over a 10x speedup at inference. Our code is available at https://github.com/thu-coai/NAST.
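The core idea of word-level, alignment-based transfer can be illustrated with a minimal sketch. This is an invented toy in plain Python, not the paper's actual model: the sentiment lexicon and tokens below are made up for demonstration, and the real NAST learns these per-position decisions with a neural non-autoregressive generator. The point it shows is that when each target word is aligned to one source position, every position can be decided independently (copy the word or replace it), which is why generation parallelizes.

```python
# Toy sketch of alignment-based word-level style transfer.
# NEG_TO_POS is an invented illustrative lexicon, not part of NAST.
NEG_TO_POS = {"terrible": "wonderful", "rude": "friendly", "hate": "love"}

def transfer_parallel(source_tokens, lexicon):
    """Decide each position independently of all others (copy or replace),
    so all decisions could in principle be made in parallel."""
    return [lexicon.get(tok, tok) for tok in source_tokens]

src = ["the", "staff", "was", "rude", "and", "i", "hate", "it"]
out = transfer_parallel(src, NEG_TO_POS)
print(" ".join(out))  # the staff was friendly and i love it
```

An autoregressive decoder would instead condition each output word on the previously generated words, serializing the loop; the per-position independence above is what yields the reported inference speedup.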


Citation (APA)

Huang, F., Chen, Z., Wu, C. H., Guo, Q., Zhu, X., & Huang, M. (2021). NAST: A Non-Autoregressive Generator with Word Alignment for Unsupervised Text Style Transfer. In Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021 (pp. 1577–1590). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.findings-acl.138

