INSET: Sentence Infilling with INter-SEntential Transformer

Citations: 20
Mendeley readers: 112

Abstract

Missing sentence generation (or sentence infilling) fosters a wide range of applications in natural language generation, such as document auto-completion and meeting note expansion. This task asks the model to generate intermediate missing sentences that can syntactically and semantically bridge the surrounding context. Solving the sentence infilling task requires techniques in natural language processing ranging from understanding to discourse-level planning to generation. In this paper, we propose a framework that decouples this challenge and addresses each of these three aspects, leveraging the power of existing large-scale pre-trained models such as BERT and GPT-2. We empirically demonstrate the effectiveness of our model in learning a sentence representation for generation, and in generating a missing sentence that fits the context.

Citation (APA)

Huang, Y., Zhang, Y., Elachqar, O., & Cheng, Y. (2020). INSET: Sentence infilling with INter-SEntential transformer. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 2502–2515). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.acl-main.226
