Attentional sequence-to-sequence models based on RNNs have achieved promising performance in automatic abstractive summarization. However, they still suffer from shortcomings such as inaccurate content and the omission of key information. This paper presents an abstractive text summarization model based on a hybrid attention mechanism, in which sentence-level attention guides the word-level attention distribution. The model also modulates the weight of the sentence-level attention values to alleviate the adverse effect of their high variance on the word-level attention distribution for short documents. Experimental results on the LCSTS dataset show that the presented model improves ROUGE scores and better summarizes the source document while retaining its important information. An example in the paper shows that the model can generate summaries similar to human-written ones.
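The abstract describes sentence-level attention guiding the word-level attention distribution, with an extra modulation of the sentence-level weights for short documents. The following is a minimal NumPy sketch of one common way such a hybrid works: word-level attention is rescaled by the attention of the enclosing sentence and renormalized, with an illustrative exponent gamma standing in for the paper's modulation term. The function name, arguments, and the exact form of the modulation are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def hybrid_attention(word_scores, sent_scores, word2sent, gamma=1.0):
    """Combine word-level and sentence-level attention (illustrative sketch).

    word_scores : (num_words,)  unnormalized word-level attention logits
    sent_scores : (num_sents,)  unnormalized sentence-level attention logits
    word2sent   : (num_words,)  index of the sentence containing each word
    gamma       : exponent modulating the influence of the sentence-level
                  weights (gamma < 1 tempers high-variance sentence attention
                  on short documents) -- a hypothetical stand-in for the
                  paper's modulation of the sentence-level attention value.
    """
    # Softmax over words and sentences separately (numerically stabilized).
    word_attn = np.exp(word_scores - word_scores.max())
    word_attn /= word_attn.sum()
    sent_attn = np.exp(sent_scores - sent_scores.max())
    sent_attn /= sent_attn.sum()

    # Sentence-level attention guides (rescales) the word-level distribution,
    # then the result is renormalized to a proper distribution over words.
    guided = word_attn * sent_attn[word2sent] ** gamma
    return guided / guided.sum()

# Toy usage: a 5-word document split into 2 sentences.
word_scores = np.array([0.2, 1.0, 0.5, 2.0, 0.1])
sent_scores = np.array([0.3, 1.5])
word2sent = np.array([0, 0, 0, 1, 1])
print(hybrid_attention(word_scores, sent_scores, word2sent, gamma=0.5))
```

With gamma = 0.5 the sentence-level weights are flattened before they rescale the word-level attention, which is one simple way to reduce their influence when a short document yields only a few, high-variance sentence weights.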
Citation: Wang, Z. (2021). An Automatic Abstractive Text Summarization Model based on Hybrid Attention Mechanism. Journal of Physics: Conference Series, Vol. 1848. IOP Publishing Ltd. https://doi.org/10.1088/1742-6596/1848/1/012057