Incorporating Distributions of Discourse Structure for Long Document Abstractive Summarization

13 citations · 25 Mendeley readers

Abstract

For text summarization, the role of discourse structure is pivotal in discerning the core content of a text. Regrettably, prior studies on incorporating Rhetorical Structure Theory (RST) into transformer-based summarization models only consider the nuclearity annotation, thereby overlooking the variety of discourse relation types. This paper introduces the 'RSTformer', a novel summarization model that comprehensively incorporates both the types and uncertainty of rhetorical relations. Our RST-attention mechanism, rooted in document-level rhetorical structure, is an extension of the recently devised Longformer framework. Through rigorous evaluation, the model proposed herein exhibits significant superiority over state-of-the-art models, as evidenced by its notable performance on several automatic metrics and human evaluation.
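The paper itself specifies the RST-attention mechanism in full; as a rough, hypothetical sketch (not the authors' implementation), one way to incorporate both relation types and their uncertainty is to bias Longformer-style attention logits with a learned per-relation weighting of the parser's relation-probability distribution between token pairs. All names and shapes below are illustrative assumptions.

```python
import numpy as np

def rst_biased_attention(scores, relation_probs, relation_weights):
    """Hypothetical sketch of discourse-aware attention.

    scores           : (n, n) raw attention logits between n tokens/units
    relation_probs   : (n, n, R) parser probability that units i and j are
                       linked by relation r -- the "distribution" capturing
                       uncertainty over rhetorical relations
    relation_weights : (R,) learned scalar weight per relation type

    Returns a row-stochastic (n, n) attention matrix. This is NOT the
    authors' actual code, only an illustration of the general idea.
    """
    # Collapse the relation distribution into a scalar bias per pair:
    # soft relation probabilities weighted by learned relation importance.
    bias = relation_probs @ relation_weights      # (n, n)
    logits = scores + bias
    # Numerically stable row-wise softmax.
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Toy example: 4 units, 3 hypothetical relation types.
rng = np.random.default_rng(0)
scores = rng.normal(size=(4, 4))
rel_probs = rng.random(size=(4, 4, 3))
rel_probs /= rel_probs.sum(axis=-1, keepdims=True)  # valid distributions
weights = np.array([0.5, -0.2, 0.1])
attn = rst_biased_attention(scores, rel_probs, weights)
```

Using the full distribution rather than a hard parser decision lets the model down-weight attention edges the discourse parser is unsure about, which is the intuition behind incorporating "uncertainty of rhetorical relations."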

Citation (APA)

Pu, D., Wang, Y., & Demberg, V. (2023). Incorporating Distributions of Discourse Structure for Long Document Abstractive Summarization. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 1, pp. 5574–5590). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.acl-long.306
