Low-level linguistic controls for style transfer and content preservation

7 citations · 76 Mendeley readers

Abstract

Despite the success of style transfer in image processing, it has seen limited progress in natural language generation. Part of the problem is that content is not as easily decoupled from style in the text domain. Curiously, in the field of stylometry, content does not figure prominently in practical methods of discriminating stylistic elements, such as authorship and genre. Rather, syntax and function words are the most salient features. Drawing on this work, we model style as a suite of low-level linguistic controls, such as the frequency of pronouns, prepositions, and subordinate clause constructions. We train a neural encoder-decoder model to reconstruct reference sentences given only content words and the settings of the controls. We perform style transfer by keeping the content words fixed while adjusting the controls to be indicative of another style. In experiments, we show that the model reliably responds to the linguistic controls, and we perform both automatic and manual evaluations of style transfer. We find we can fool a style classifier 84% of the time, and that our model produces highly diverse and stylistically distinctive outputs. This work introduces a formal, extendable model of style that can add control to any neural text generation system.
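To make the idea of low-level linguistic controls concrete, here is a minimal, hypothetical sketch (not the authors' code) that computes two such controls as per-token frequencies of pronouns and prepositions. The word lists are illustrative subsets, not the paper's actual feature set.

```python
# Illustrative subsets of English function words; the paper's full
# control suite also covers e.g. subordinate clause constructions.
PRONOUNS = {"i", "you", "he", "she", "it", "we", "they",
            "me", "him", "her", "us", "them"}
PREPOSITIONS = {"in", "on", "at", "by", "with", "from",
                "to", "of", "for", "over", "under"}

def control_vector(tokens):
    """Return (pronoun_freq, preposition_freq) per token."""
    n = len(tokens) or 1  # guard against empty input
    lowered = [t.lower() for t in tokens]
    pron = sum(t in PRONOUNS for t in lowered) / n
    prep = sum(t in PREPOSITIONS for t in lowered) / n
    return pron, prep

tokens = "We train a neural model on sentences from the corpus".split()
print(control_vector(tokens))  # one pronoun and two prepositions in 10 tokens
```

At transfer time, one would hold the content words fixed and feed the decoder a control vector characteristic of the target style rather than the one measured on the source sentence.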

Citation (APA)

Gero, K. I., Kedzie, C., Reeve, J., & Chilton, L. B. (2019). Low-level linguistic controls for style transfer and content preservation. In INLG 2019 - 12th International Conference on Natural Language Generation, Proceedings of the Conference (pp. 208–218). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/W19-8628
