Semi-supervised text simplification with back-translation and asymmetric denoising autoencoders

27 citations · 36 Mendeley readers

Abstract

Text simplification (TS) rephrases long sentences into simplified variants while preserving the inherent semantics. Traditional sequence-to-sequence models rely heavily on the quantity and quality of parallel sentences, which limits their applicability across languages and domains. This work investigates how to leverage large amounts of unpaired corpora for the TS task. We adopt the back-translation architecture from unsupervised neural machine translation (NMT), including denoising autoencoders for language modeling and automatic generation of parallel data by iterative back-translation. However, it is non-trivial to generate appropriate complex-simple pairs if we directly treat the sets of simple and complex corpora as two different languages, since the two types of sentences are quite similar and it is hard for the model to capture the characteristics of each. To tackle this problem, we propose asymmetric denoising methods for sentences of different complexity. When modeling simple and complex sentences with autoencoders, we introduce different types of noise into the training process. This method significantly improves simplification performance. Our model can be trained in both an unsupervised and a semi-supervised manner. Automatic and human evaluations show that our unsupervised model outperforms previous systems, and that with limited supervision our model performs competitively with multiple state-of-the-art simplification systems.
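The asymmetric denoising idea can be made concrete with a small sketch. The abstract does not specify which corruption operations are applied to each side, so the particular noise functions below (word insertion for the simple-side autoencoder, word dropout for the complex-side one, plus light local shuffling for both) are illustrative assumptions rather than the authors' exact recipe.

```python
import random

def drop_words(tokens, p=0.1):
    """Randomly drop tokens with probability p (always keep at least one)."""
    kept = [t for t in tokens if random.random() > p]
    return kept if kept else [random.choice(tokens)]

def insert_words(tokens, vocab, p=0.1):
    """Randomly insert vocabulary words with probability p per position."""
    out = []
    for t in tokens:
        if random.random() < p:
            out.append(random.choice(vocab))
        out.append(t)
    return out

def shuffle_locally(tokens, k=3):
    """Lightly shuffle tokens so each one moves at most ~k positions."""
    keys = [i + random.uniform(0, k) for i in range(len(tokens))]
    return [t for _, t in sorted(zip(keys, tokens), key=lambda x: x[0])]

def noise_simple(tokens, vocab):
    """Hypothetical noise for the simple-sentence autoencoder:
    insert extra words so the decoder learns to produce short output."""
    return shuffle_locally(insert_words(tokens, vocab))

def noise_complex(tokens):
    """Hypothetical noise for the complex-sentence autoencoder:
    drop words so the decoder learns to reconstruct longer sentences."""
    return shuffle_locally(drop_words(tokens))
```

In this kind of setup each autoencoder is trained to reconstruct its clean input from the asymmetrically corrupted version, and the same decoders are then reused during iterative back-translation between the two sentence types.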

Cite (APA)

Zhao, Y., Chen, L., Chen, Z., & Yu, K. (2020). Semi-supervised text simplification with back-translation and asymmetric denoising autoencoders. In AAAI 2020 - 34th AAAI Conference on Artificial Intelligence (pp. 9668–9675). AAAI Press. https://doi.org/10.1609/aaai.v34i05.6515
