Unsupervised document summarization using pre-trained sentence embeddings and graph centrality


Abstract

This paper describes our submission for the LongSumm task in SDP 2021. We propose an unsupervised method for incorporating sentence embeddings produced by deep language models into extractive summarization techniques based on graph centrality. The proposed method is simple and fast, can summarize documents of any size, and can satisfy any length constraint on the summaries produced. It offers performance competitive with more sophisticated supervised methods and can serve as a proxy for abstractive summarization techniques.
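The pipeline described in the abstract (embed sentences with a pre-trained encoder, connect them in a similarity graph, rank them by centrality, and extract the top-ranked sentences) can be sketched as follows. This is a minimal illustration rather than the authors' implementation: the sentence-transformers model name, the cosine-similarity graph construction, and the choice of PageRank as the centrality measure are all assumptions.

```python
# Illustrative sketch (not the authors' exact pipeline): extractive summarization
# by ranking sentences with graph centrality over pre-trained sentence embeddings.
# Assumes the sentence-transformers and networkx packages; the model name is an example.
import numpy as np
import networkx as nx
from sentence_transformers import SentenceTransformer

def summarize(sentences, max_sentences=5, model_name="all-MiniLM-L6-v2"):
    # Embed each sentence with a pre-trained sentence encoder.
    model = SentenceTransformer(model_name)
    embeddings = model.encode(sentences, normalize_embeddings=True)

    # Build a similarity graph: cosine similarity between embeddings as edge weights,
    # clipped to be non-negative and with self-loops removed.
    similarity = np.clip(embeddings @ embeddings.T, 0.0, 1.0)
    np.fill_diagonal(similarity, 0.0)
    graph = nx.from_numpy_array(similarity)

    # Score sentences by graph centrality (PageRank here as one possible choice),
    # keep the top-ranked ones, and restore their original order for readability.
    scores = nx.pagerank(graph, weight="weight")
    top = sorted(scores, key=scores.get, reverse=True)[:max_sentences]
    return [sentences[i] for i in sorted(top)]

if __name__ == "__main__":
    doc = [
        "Graph-based rankers such as PageRank score sentences by centrality.",
        "Pre-trained sentence encoders map sentences to dense vectors.",
        "Cosine similarity between vectors defines the edge weights of the graph.",
        "The highest-ranked sentences form the extractive summary.",
        "This procedure needs no training data, so it is fully unsupervised.",
    ]
    print("\n".join(summarize(doc, max_sentences=2)))
```

Because the summary length is controlled only by how many top-ranked sentences are kept, a budget in words or characters could be enforced the same way, which is consistent with the abstract's claim that the method can satisfy arbitrary length constraints.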

Citation (APA)

Ramirez-Orta, J., & Milios, E. (2021). Unsupervised document summarization using pre-trained sentence embeddings and graph centrality. In 2nd Workshop on Scholarly Document Processing, SDP 2021 - Proceedings of the Workshop (pp. 110–115). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.sdp-1.14
