Global encoding for abstractive summarization

Citations: 115
Readers (Mendeley): 272

Abstract

In neural abstractive summarization, the conventional sequence-to-sequence (seq2seq) model often suffers from repetition and semantic irrelevance. To tackle this problem, we propose a global encoding framework, which controls the information flow from the encoder to the decoder based on the global information of the source context. It consists of a convolutional gated unit that performs global encoding to improve the representations of the source-side information. Evaluations on both LCSTS and the English Gigaword demonstrate that our model outperforms the baseline models, and the analysis shows that our model is capable of generating summaries of higher quality and reducing repetition.
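The core idea in the abstract, a gate computed from global source context that filters each encoder state before decoding, can be sketched in a few lines. The following PyTorch snippet is a minimal, hypothetical rendering of a convolutional gated unit: the class name `ConvGatedUnit`, the `kernel_size` choice, and the use of a single convolution as the gate are illustrative assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn

class ConvGatedUnit(nn.Module):
    """Minimal sketch of a convolutional gated unit for global encoding.

    Simplified from the idea in Lin et al. (2018): a 1-D convolution over
    the encoder hidden states captures surrounding context, and a sigmoid
    gate derived from it filters each time step's representation.
    """

    def __init__(self, hidden_size: int, kernel_size: int = 3):
        super().__init__()
        # Same-length convolution along the time axis; kernel_size is an
        # assumed hyperparameter, not taken from the paper's configuration.
        self.conv = nn.Conv1d(
            hidden_size, hidden_size,
            kernel_size, padding=kernel_size // 2,
        )

    def forward(self, enc_states: torch.Tensor) -> torch.Tensor:
        # enc_states: (batch, seq_len, hidden_size) from the seq2seq encoder
        x = enc_states.transpose(1, 2)       # (batch, hidden_size, seq_len)
        gate = torch.sigmoid(self.conv(x))   # context-dependent gate in [0, 1]
        gated = x * gate                     # filter the source representations
        return gated.transpose(1, 2)         # back to (batch, seq_len, hidden_size)

# Usage: gate the encoder outputs before attention and decoding.
encoder_out = torch.randn(8, 20, 512)        # toy batch of encoder states
cgu = ConvGatedUnit(hidden_size=512)
filtered = cgu(encoder_out)                  # same shape, gated per time step
```

Because the gate is computed from a window of neighboring states rather than a single time step, each source representation is filtered against its context, which is what lets the model suppress the repeated and irrelevant content the abstract describes.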

Citation (APA)

Lin, J., Sun, X., Ma, S., & Su, Q. (2018). Global encoding for abstractive summarization. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers) (pp. 163–169). Association for Computational Linguistics. https://doi.org/10.18653/v1/P18-2027
