Previous work on syntactically controlled paraphrase generation relies heavily on large-scale parallel paraphrase data, which are not easily available for many languages and domains. In this paper, we take this research direction to the extreme and investigate whether it is possible to learn syntactically controlled paraphrase generation from non-parallel data. We propose a syntactically-informed unsupervised paraphrasing model based on a conditional variational auto-encoder (VAE) that can generate text with a specified syntactic structure. In particular, we design a two-stage learning method to train the model effectively on non-parallel data. In the first stage, the conditional VAE is trained to reconstruct the input sentence conditioned on the sentence itself and its syntactic structure. In the second stage, to improve the syntactic controllability and semantic consistency of the pre-trained conditional VAE, we fine-tune it with syntax-controlling and cycle-reconstruction objectives, employing Gumbel-Softmax to combine these new learning objectives. Experimental results demonstrate that the proposed model, trained only on non-parallel data, is capable of generating diverse paraphrases with specified structures. Additionally, we validate the effectiveness of our method for generating syntactically adversarial examples on a sentiment analysis task. Source code is available at https://github.com/lanse-sir/sup.
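The abstract mentions that Gumbel-Softmax is used so that discrete decoder outputs can be passed through the cycle-reconstruction objective. As a rough illustration of that building block only (not the authors' released implementation; the function name, temperature, and tensor shapes below are assumptions), here is a minimal PyTorch sketch of straight-through Gumbel-Softmax sampling:

```python
import torch
import torch.nn.functional as F

def gumbel_softmax_sample(logits, temperature=0.5, hard=True):
    """Differentiable (approximately one-hot) sample from a categorical
    distribution over the vocabulary via the Gumbel-Softmax trick.
    With hard=True the forward pass is one-hot while gradients flow
    through the soft relaxation (straight-through estimator)."""
    # Sample Gumbel(0, 1) noise and perturb the logits.
    gumbel = -torch.log(-torch.log(torch.rand_like(logits) + 1e-20) + 1e-20)
    y_soft = F.softmax((logits + gumbel) / temperature, dim=-1)
    if not hard:
        return y_soft
    # Straight-through: discrete one-hot in the forward pass, soft gradient in the backward pass.
    index = y_soft.argmax(dim=-1, keepdim=True)
    y_hard = torch.zeros_like(logits).scatter_(-1, index, 1.0)
    return y_hard - y_soft.detach() + y_soft

# Toy usage: relaxed token choices over an 8-word vocabulary for 2 positions,
# which could be fed back into an encoder while keeping gradients intact.
logits = torch.randn(2, 8, requires_grad=True)
tokens = gumbel_softmax_sample(logits)
```

In a cycle-reconstruction setup, such relaxed token samples allow a generated paraphrase to be re-encoded and compared against the original sentence without breaking backpropagation.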
CITATION
Yang, E., Liu, M., Xiong, D., Zhang, Y., Meng, Y., Hu, C., … Chen, Y. (2021). Syntactically-Informed Unsupervised Paraphrasing with Non-Parallel Data. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021) (pp. 2594–2604). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.emnlp-main.203