Main point generator: Summarizing with a focus

Abstract

Text summarization is attracting more and more attention as deep neural networks achieve success across NLP tasks. One problem with such models is their inability to focus on the essentials of a document, so they may generate summaries that omit the most important content, especially in multi-sentence summarization. In this paper, we propose the Main Pointer Generator (MPG) to address this problem: at each decoder step, the whole document is taken into consideration when calculating the probability of the next generated token. We experiment on the CNN/Daily Mail corpus, and the results show that the summaries generated by MPG follow the main theme of the document while outperforming the original pointer-generator network by about 0.5 ROUGE points.
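
The decoding step sketched in the abstract, in which the next-token probability mixes a generation distribution with a copy distribution while attention is conditioned on a representation of the whole document, can be illustrated with a small numerical sketch. This is not the authors' implementation: the document vector below is just the mean of the encoder states, the projection matrices are random placeholders rather than learned parameters, and names such as mpg_decode_step and doc_vector are hypothetical stand-ins.

import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def mpg_decode_step(enc_states, doc_vector, dec_state, vocab_size, src_ids, rng):
    """One pointer-generator-style decoding step that also conditions on a
    whole-document ("main point") vector. Projections are random placeholders."""
    d = dec_state.shape[0]
    # Attention over encoder states, conditioned on decoder state + document vector.
    scores = enc_states @ (dec_state + doc_vector)            # (src_len,)
    attn = softmax(scores)
    context = attn @ enc_states                               # (d,)
    # Generation distribution over the vocabulary.
    W_gen = rng.standard_normal((vocab_size, 2 * d)) * 0.01
    p_vocab = softmax(W_gen @ np.concatenate([dec_state, context]))
    # Mixture weight between generating a new token and copying a source token.
    w_pgen = rng.standard_normal(3 * d) * 0.01
    p_gen = 1.0 / (1.0 + np.exp(-w_pgen @ np.concatenate([dec_state, context, doc_vector])))
    # Copy distribution: scatter attention weights onto the source token ids.
    p_copy = np.zeros(vocab_size)
    np.add.at(p_copy, src_ids, attn)
    return p_gen * p_vocab + (1.0 - p_gen) * p_copy

rng = np.random.default_rng(0)
d, src_len, vocab_size = 8, 5, 50
enc_states = rng.standard_normal((src_len, d))
doc_vector = enc_states.mean(axis=0)      # crude stand-in for a learned document representation
dec_state = rng.standard_normal(d)
src_ids = rng.integers(0, vocab_size, size=src_len)
p_next = mpg_decode_step(enc_states, doc_vector, dec_state, vocab_size, src_ids, rng)
print(p_next.shape, p_next.sum())         # (50,) and approximately 1.0

The key design point the sketch highlights is that both the attention scores and the generate-versus-copy gate see the document-level vector, which is what lets the decoder keep the summary anchored to the main theme rather than to whatever the most recent decoder state happens to attend to.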

Cite (APA)

Chung, T. L., Xu, B., Liu, Y., & Ouyang, C. (2018). Main point generator: Summarizing with a focus. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 10827 LNCS, pp. 924–932). Springer Verlag. https://doi.org/10.1007/978-3-319-91452-7_60
