This paper describes our submission to the LongSumm task at SDP 2021. We propose an unsupervised method that incorporates sentence embeddings produced by deep language models into extractive summarization techniques based on graph centrality. The proposed method is simple and fast, can summarize documents of any size, and can satisfy arbitrary length constraints on the summaries it produces. It offers performance competitive with more sophisticated supervised methods and can serve as a proxy for abstractive summarization techniques.
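Because the approach combines off-the-shelf components, the overall pipeline can be illustrated with a short sketch: encode each sentence with a pre-trained sentence embedding model, build a similarity graph over the sentences, rank them by graph centrality, and greedily select the top-ranked sentences until a length budget is reached. The embedding model name, cosine similarity, PageRank centrality, and the 600-word budget below are illustrative assumptions, not necessarily the authors' exact configuration.

```python
# Minimal sketch of unsupervised extractive summarization with
# pre-trained sentence embeddings and graph centrality.
# The specific model, similarity measure, centrality algorithm and
# length budget are assumptions for illustration only.
import numpy as np
import networkx as nx
from sentence_transformers import SentenceTransformer

def summarize(sentences, max_words=600, model_name="all-MiniLM-L6-v2"):
    # Encode each sentence with a pre-trained sentence embedding model.
    model = SentenceTransformer(model_name)
    emb = model.encode(sentences, normalize_embeddings=True)

    # Build a dense graph whose edge weights are cosine similarities
    # (clipped to be non-negative so they are valid edge weights).
    sim = np.clip(emb @ emb.T, 0.0, 1.0)
    graph = nx.from_numpy_array(sim)

    # Rank sentences by graph centrality (PageRank used here).
    scores = nx.pagerank(graph, weight="weight")
    ranked = sorted(scores, key=scores.get, reverse=True)

    # Greedily keep the highest-ranked sentences that still fit the
    # length budget, then restore the original document order.
    chosen, budget = [], max_words
    for idx in ranked:
        n_words = len(sentences[idx].split())
        if n_words <= budget:
            chosen.append(idx)
            budget -= n_words
    return [sentences[i] for i in sorted(chosen)]
```

Because the selection step only depends on the ranking and the word budget, the same sketch satisfies any target summary length by changing `max_words`.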