Hie-BART: Document Summarization with Hierarchical BART

Abstract

This paper proposes a new abstractive summarization model for documents, hierarchical BART (Hie-BART), which captures the hierarchical structure of documents (i.e., their sentence-word structure) in the BART model. Although the existing BART model has achieved state-of-the-art performance on document summarization tasks, it does not account for interactions between sentence-level and word-level information. In machine translation, the performance of neural translation models can be improved by incorporating multi-granularity self-attention (MG-SA), which captures relationships between words and phrases. Inspired by this work, the proposed Hie-BART model incorporates MG-SA into the encoder of the BART model to capture sentence-word structure. Evaluations on the CNN/Daily Mail dataset show that Hie-BART outperforms strong baselines and improves on a non-hierarchical BART model (+0.23 ROUGE-L).
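To make the idea concrete, below is a minimal sketch, not the authors' implementation, of how word-level and sentence-level self-attention could be combined in an encoder layer. It assumes PyTorch; the function names, the per-token sentence index sent_ids, and the mixing weight alpha are all hypothetical choices for illustration. The paper's actual MG-SA instead assigns different granularities to different attention heads inside BART's encoder.

import torch
import torch.nn.functional as F

def scaled_dot_attention(q, k, v, mask=None):
    # Standard scaled dot-product attention; mask is a boolean
    # matrix where False positions are excluded from attention.
    scores = q @ k.transpose(-2, -1) / (q.size(-1) ** 0.5)
    if mask is not None:
        scores = scores.masked_fill(~mask, float("-inf"))
    return F.softmax(scores, dim=-1) @ v

def multi_granularity_attention(x, sent_ids, alpha=0.5):
    # x:        (seq_len, d_model) token representations
    # sent_ids: (seq_len,) sentence index of each token
    # alpha:    hypothetical weight mixing the two granularities
    # Word level: every token attends to every other token.
    word_out = scaled_dot_attention(x, x, x)
    # Sentence level: tokens attend only within their own sentence,
    # so this branch sees sentence-internal structure.
    same_sentence = sent_ids.unsqueeze(0) == sent_ids.unsqueeze(1)
    sent_out = scaled_dot_attention(x, x, x, mask=same_sentence)
    return alpha * word_out + (1 - alpha) * sent_out

# Toy usage: six tokens forming two three-token sentences.
x = torch.randn(6, 16)
sent_ids = torch.tensor([0, 0, 0, 1, 1, 1])
out = multi_granularity_attention(x, sent_ids)
print(out.shape)  # torch.Size([6, 16])

Splitting attention heads by granularity, as in the actual MG-SA, lets a single layer capture both levels without the explicit mixing weight used in this sketch.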

Citation (APA)

Akiyama, K., Tamura, A., & Ninomiya, T. (2021). Hie-BART: Document Summarization with Hierarchical BART. In NAACL-HLT 2021 - 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Proceedings of the Student Research Workshop (pp. 159–165). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.naacl-srw.20
