Neural syntactic preordering for controlled paraphrase generation


Abstract

Paraphrasing natural language sentences is a multifaceted process: it might involve replacing individual words or short phrases, local rearrangement of content, or high-level restructuring like topicalization or passivization. Past approaches struggle to cover this space of paraphrase possibilities in an interpretable manner. Our work, inspired by pre-ordering literature in machine translation, uses syntactic transformations to softly “reorder” the source sentence and guide our neural paraphrasing model. First, given an input sentence, we derive a set of feasible syntactic rearrangements using an encoder-decoder model. This model operates over a partially lexical, partially syntactic view of the sentence and can reorder big chunks. Next, we use each proposed rearrangement to produce a sequence of position embeddings, which encourages our final encoder-decoder paraphrase model to attend to the source words in a particular order. Our evaluation, both automatic and human, shows that the proposed system retains the quality of the baseline approaches while giving a substantial increase in the diversity of the generated paraphrases.
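The abstract's second step, turning a proposed rearrangement into position embeddings that guide the paraphrase model's attention order, can be illustrated with a minimal sketch. This is not the authors' code; the function name and the example permutation are hypothetical. The idea shown is only the bookkeeping: each source token is assigned the position it would occupy after the rearrangement, and a position-aware encoder-decoder would then consume these ids instead of the default 0..n-1 order.

```python
def reordered_position_ids(tokens, reordering):
    """Map a proposed rearrangement to per-token position ids.

    tokens:     source words (length n)
    reordering: a permutation of range(n); reordering[i] is the original
                index of the token placed at slot i after rearrangement.

    Returns a list where entry j is the new position of source token j,
    i.e. the position id fed to the encoder for that token.
    """
    positions = [0] * len(tokens)
    for new_pos, src_idx in enumerate(reordering):
        positions[src_idx] = new_pos
    return positions


# Hypothetical example: move the verb phrase to the front of a 3-token span.
tokens = ["dogs", "chase", "cats"]
reordering = [2, 0, 1]  # slot 0 <- "cats", slot 1 <- "dogs", slot 2 <- "chase"
print(reordered_position_ids(tokens, reordering))  # [1, 2, 0]
```

In the paper's framing these ids only *softly* encourage the target order: the model attends to source words guided by the embeddings rather than being forced to emit them in exactly that sequence.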

Citation (APA)
Goyal, T., & Durrett, G. (2020). Neural syntactic preordering for controlled paraphrase generation. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 238–252). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.acl-main.22
