In this study, we propose sentence simplification from a non-parallel corpus with adversarial learning. In recent years, sentence simplification based on statistical machine translation frameworks and neural networks has been actively studied. However, most methods require a large parallel corpus, which is expensive to build. In this paper, our goal is sentence simplification with a non-parallel corpus built from openly available en-Wikipedia and Simple-Wikipedia articles. We use a style transfer framework with adversarial learning to learn from the non-parallel corpus, adapting prior work [by Barzilay et al.] on style transfer as our base framework for sentence simplification. Furthermore, to improve the retention of sentence meaning, we add a pretraining reconstruction loss and a cycle consistency loss to the base framework. As a result of these extensions, the quality of the sentences output by the proposed model also improves.
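The sketch below is not the authors' code; it is a minimal PyTorch illustration of how the three objectives named in the abstract (an adversarial style loss plus the added reconstruction and cycle consistency losses) could be combined in one training step. The model classes, loss weights, and greedy decoding used for the cycle term are illustrative assumptions.

```python
import torch
import torch.nn as nn

class SimpleEncoderDecoder(nn.Module):
    """Toy seq2seq stand-in: embeds token ids and re-projects to the vocabulary."""
    def __init__(self, vocab_size=1000, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.rnn = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, token_ids):
        h, _ = self.rnn(self.embed(token_ids))
        return self.out(h)                      # (batch, seq, vocab) logits

class StyleDiscriminator(nn.Module):
    """Predicts whether decoder outputs look 'simple' (1) or 'complex' (0)."""
    def __init__(self, vocab_size=1000, hidden=64):
        super().__init__()
        self.proj = nn.Sequential(nn.Linear(vocab_size, hidden), nn.ReLU(),
                                  nn.Linear(hidden, 1))

    def forward(self, logits):
        return torch.sigmoid(self.proj(logits.mean(dim=1)))  # pool over time

def training_step(complex_ids, simple_ids, c2s, s2c, disc, bce, ce):
    """One illustrative generator update combining the three loss terms
    (the discriminator's own update is omitted for brevity)."""
    # 1) Adversarial style loss: complex->simple outputs should fool the
    #    discriminator into labelling them as genuinely simple sentences.
    fake_simple_logits = c2s(complex_ids)
    adv_loss = bce(disc(fake_simple_logits),
                   torch.ones(complex_ids.size(0), 1))

    # 2) Reconstruction loss: a simple sentence passed through the
    #    simplifier should reproduce itself (auto-encoding).
    rec_logits = c2s(simple_ids)
    rec_loss = ce(rec_logits.reshape(-1, rec_logits.size(-1)),
                  simple_ids.reshape(-1))

    # 3) Cycle consistency loss: complex -> simple -> complex should recover
    #    the original complex sentence (greedy decode used for brevity).
    pseudo_simple = fake_simple_logits.argmax(dim=-1)
    cyc_logits = s2c(pseudo_simple)
    cyc_loss = ce(cyc_logits.reshape(-1, cyc_logits.size(-1)),
                  complex_ids.reshape(-1))

    return adv_loss + rec_loss + cyc_loss       # loss weights omitted

if __name__ == "__main__":
    c2s, s2c, disc = SimpleEncoderDecoder(), SimpleEncoderDecoder(), StyleDiscriminator()
    bce, ce = nn.BCELoss(), nn.CrossEntropyLoss()
    complex_batch = torch.randint(0, 1000, (4, 12))   # dummy token ids
    simple_batch = torch.randint(0, 1000, (4, 12))
    loss = training_step(complex_batch, simple_batch, c2s, s2c, disc, bce, ce)
    loss.backward()
    print(float(loss))
```

In this sketch the reconstruction and cycle terms are what tie the transferred sentence back to the original meaning, which is the role the abstract assigns to the two added losses; the relative weighting of the three terms is left unspecified here.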
CITATION STYLE
Kawashima, T., & Takagi, T. (2019). Sentence simplification from non-parallel corpus with adversarial learning. In Proceedings - 2019 IEEE/WIC/ACM International Conference on Web Intelligence, WI 2019 (pp. 43–50). Association for Computing Machinery, Inc. https://doi.org/10.1145/3350546.3352499