Cross-language generative automatic summarization based on attention mechanism

Abstract

Generative automatic summarization is a fundamental problem in natural language processing. We propose a cross-language generative automatic summarization model. Unlike traditional methods, which rely on machine translation as an intermediate step, our model directly generates a summary in one language from a source text in another. We use the RNNLM (Recurrent Neural Network Language Model) architecture to pre-train word vectors that capture semantic information in each language, and we incorporate a soft attention mechanism into a Seq2Seq model trained on a parallel corpus built from Chinese, Korean, and English. In this way, cross-language automatic summarization is achieved without any machine translation component. Experiments show that the proposed model improves ROUGE-1, ROUGE-2, and ROUGE-L scores by 6%, 2.46%, and 5.13%, respectively, demonstrating its effectiveness.
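The paper does not include code, but the abstract's core mechanism, soft attention inside a Seq2Seq decoder, can be illustrated with a minimal PyTorch sketch. This is an additive (Bahdanau-style) attention module; the class, parameter names, and dimensions are illustrative assumptions, not the authors' implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SoftAttention(nn.Module):
    """Additive soft attention over encoder hidden states (a sketch)."""

    def __init__(self, enc_dim: int, dec_dim: int, attn_dim: int):
        super().__init__()
        self.W_enc = nn.Linear(enc_dim, attn_dim, bias=False)
        self.W_dec = nn.Linear(dec_dim, attn_dim, bias=False)
        self.v = nn.Linear(attn_dim, 1, bias=False)

    def forward(self, dec_state: torch.Tensor, enc_outputs: torch.Tensor):
        # dec_state: (batch, dec_dim) -- current decoder hidden state
        # enc_outputs: (batch, src_len, enc_dim) -- all encoder states
        scores = self.v(torch.tanh(
            self.W_enc(enc_outputs) + self.W_dec(dec_state).unsqueeze(1)
        )).squeeze(-1)                          # (batch, src_len)
        weights = F.softmax(scores, dim=-1)     # attention distribution
        # Weighted sum of encoder states -> context vector for this step
        context = torch.bmm(weights.unsqueeze(1), enc_outputs).squeeze(1)
        return context, weights
```

At each decoding step the context vector is concatenated with the decoder input (or state) before predicting the next summary token, which lets the decoder attend to source-language tokens while emitting target-language text. For the reported metrics, one common tool is Google's `rouge-score` package; the paper does not name its evaluation toolkit, so this is only a plausible setup:

```python
from rouge_score import rouge_scorer

scorer = rouge_scorer.RougeScorer(["rouge1", "rouge2", "rougeL"], use_stemmer=True)
scores = scorer.score("the reference summary", "the generated summary")
print(scores["rouge1"].fmeasure, scores["rouge2"].fmeasure, scores["rougeL"].fmeasure)
```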

Citation

Yang, F., Cui, R., Yi, Z., & Zhao, Y. (2020). Cross-language generative automatic summarization based on attention mechanism. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 12432 LNCS, pp. 236–247). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-60029-7_22
