Principled Paraphrase Generation with Parallel Corpora

Abstract

Round-trip Machine Translation (MT) is a popular choice for paraphrase generation, as it leverages readily available parallel corpora for supervision. In this paper, we formalize the implicit similarity function induced by this approach, and show that it is susceptible to non-paraphrase pairs sharing a single ambiguous translation. Based on these insights, we design an alternative similarity metric that mitigates this issue by requiring the entire translation distribution to match, and implement a relaxation of it through the Information Bottleneck method. Our approach incorporates an adversarial term into MT training in order to learn representations that encode as much information about the reference translation as possible, while keeping as little information about the input as possible. Paraphrases can be generated by decoding back to the source from this representation, without having to generate pivot translations. In addition to being more principled and efficient than round-trip MT, our approach offers an adjustable parameter to control the fidelity-diversity trade-off, and obtains better results in our experiments.
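For context, the Information Bottleneck method compresses an input X into a representation Z that stays maximally informative about a target Y. A minimal statement of the standard objective (due to Tishby et al.), where X is the source sentence, Y its reference translation, and the trade-off weight beta plausibly corresponds to the adjustable fidelity-diversity parameter the abstract mentions:

    min_{p(z|x)}  I(X; Z) - beta * I(Z; Y)

Both mutual-information terms are intractable for a neural MT model, which is where the adversarial term described above serves as a practical relaxation: the encoder is rewarded for representations that support predicting the reference translation, and penalized when an auxiliary adversary can recover input-specific information from them. This sketch states only the generic objective; the exact formulation and parameterization used by the authors may differ.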

Citation (APA)

Ormazabal, A., Artetxe, M., Soroa, A., Labaka, G., & Agirre, E. (2022). Principled Paraphrase Generation with Parallel Corpora. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 1, pp. 1621–1638). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.acl-long.114
