Learning Non-Autoregressive Models from Search for Unsupervised Sentence Summarization

Abstract

Text summarization aims to generate a short summary for an input text. In this work, we propose a Non-Autoregressive Unsupervised Summarization (NAUS) approach, which does not require parallel data for training. NAUS first performs edit-based search towards a heuristically defined score, generating a summary that serves as pseudo-groundtruth. We then train an encoder-only non-autoregressive Transformer on the search results. We also propose a dynamic programming approach for length-control decoding, which is important for the summarization task. Experiments on two datasets show that NAUS achieves state-of-the-art performance for unsupervised summarization while substantially improving inference efficiency. Further, our algorithm is able to perform explicit length-transfer summary generation.
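The length-control decoding idea can be illustrated with a small dynamic program. The sketch below is a simplified, hypothetical illustration rather than the paper's exact algorithm: it assumes a CTC-style non-autoregressive model that emits one token (or a blank) per source position, and it does not merge repeated tokens. The function name `length_control_decode`, the `blank` convention, and the input layout are all assumptions made for this example.

```python
def length_control_decode(logp, target_len, blank=0):
    """Simplified DP sketch of length-control decoding (assumed CTC-style
    emission, no repeat merging -- NOT the paper's exact algorithm).

    logp:       T x V matrix of per-position token log-probabilities,
                where column `blank` is the empty token.
    target_len: exact number of non-blank tokens the summary must have.
    Returns (best_score, token_ids).
    """
    T, V = len(logp), len(logp[0])
    NEG = float("-inf")
    # dp[t][k]: best score after consuming t positions, emitting k tokens.
    dp = [[NEG] * (target_len + 1) for _ in range(T + 1)]
    back = [[None] * (target_len + 1) for _ in range(T + 1)]
    dp[0][0] = 0.0
    for t in range(T):
        for k in range(min(t, target_len) + 1):
            if dp[t][k] == NEG:
                continue
            # Option 1: emit a blank at position t (length unchanged).
            s = dp[t][k] + logp[t][blank]
            if s > dp[t + 1][k]:
                dp[t + 1][k] = s
                back[t + 1][k] = (k, blank)
            # Option 2: emit the best non-blank token (length + 1).
            # Greedy per-position choice is optimal here because, in this
            # simplified model, the token identity does not affect the state.
            if k < target_len:
                w = max((v for v in range(V) if v != blank),
                        key=lambda v: logp[t][v])
                s = dp[t][k] + logp[t][w]
                if s > dp[t + 1][k + 1]:
                    dp[t + 1][k + 1] = s
                    back[t + 1][k + 1] = (k, w)
    # Backtrack to recover the emitted token sequence.
    tokens, k = [], target_len
    for t in range(T, 0, -1):
        prev_k, tok = back[t][k]
        if tok != blank:
            tokens.append(tok)
        k = prev_k
    tokens.reverse()
    return dp[T][target_len], tokens
```

Under these assumptions, the table has O(T x target_len) states with O(V) work per state, so an exact desired summary length can be enforced in polynomial time instead of by heuristic truncation.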

Citation (APA)
Liu, P., Huang, C., & Mou, L. (2022). Learning Non-Autoregressive Models from Search for Unsupervised Sentence Summarization. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 1, pp. 7916–7929). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.acl-long.545
