Keyphrase generation with word attention

Abstract

Keyphrase generation aims to generate several words or phrases that concisely summarize the semantics of an article, and it underlies many natural language processing tasks. Although most previous approaches achieve good results, they neglect the independent word-level information in the source text. Previous models use an attention mechanism to compute the relationship between the encoder RNN hidden states and the target side; however, the hidden state $$h_t$$ is a summarization of the first $$t$$ words as a subsequence of the source sentence. In this paper, we propose a novel sequence-to-sequence model called WordRNN, which captures word-level representations in the source text. Our model enriches the representation of the source text by directly exploiting pure word-level information. Moreover, we use a fuse gate and a simple concatenation operation to combine the subsequence-level and word-level contextual information. Experimental results demonstrate that our approach outperforms state-of-the-art methods.
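The abstract describes attending over raw word-level representations in addition to the encoder RNN hidden states, and merging the two contexts with a fuse gate. Below is a minimal PyTorch sketch of what such a fusion module might look like; all module names, dimensions, and the dot-product scoring function are assumptions for illustration and may differ from the paper's actual formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class WordAttentionFusion(nn.Module):
    """Illustrative sketch: combine subsequence-level attention (over RNN
    hidden states h_t) with word-level attention (over word embeddings)
    via a learned fuse gate. Names and shapes are assumptions."""

    def __init__(self, hidden_dim, embed_dim):
        super().__init__()
        self.word_proj = nn.Linear(embed_dim, hidden_dim)      # project word embeddings
        self.fuse_gate = nn.Linear(2 * hidden_dim, hidden_dim)  # gate over both contexts

    def forward(self, decoder_state, encoder_states, word_embeddings, mask=None):
        # decoder_state:   (batch, hidden_dim)           current decoder hidden state
        # encoder_states:  (batch, src_len, hidden_dim)   subsequence-level states h_t
        # word_embeddings: (batch, src_len, embed_dim)    word-level representations
        words = self.word_proj(word_embeddings)              # (batch, src_len, hidden_dim)

        # Dot-product attention scores over RNN hidden states (subsequence level)
        # and over projected word embeddings (word level).
        seq_scores = torch.bmm(encoder_states, decoder_state.unsqueeze(2)).squeeze(2)
        word_scores = torch.bmm(words, decoder_state.unsqueeze(2)).squeeze(2)

        if mask is not None:
            seq_scores = seq_scores.masked_fill(mask == 0, float("-inf"))
            word_scores = word_scores.masked_fill(mask == 0, float("-inf"))

        # Context vectors from each attention distribution.
        seq_ctx = torch.bmm(F.softmax(seq_scores, dim=1).unsqueeze(1), encoder_states).squeeze(1)
        word_ctx = torch.bmm(F.softmax(word_scores, dim=1).unsqueeze(1), words).squeeze(1)

        # Fuse gate: learn how much of each context to keep
        # (a plain concatenation of the two contexts is the simpler alternative).
        gate = torch.sigmoid(self.fuse_gate(torch.cat([seq_ctx, word_ctx], dim=1)))
        return gate * seq_ctx + (1 - gate) * word_ctx
```

The gated combination lets the decoder weight word-level context against subsequence-level context per step, whereas the concatenation variant simply stacks both vectors and leaves the weighting to downstream layers.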

Cite

CITATION STYLE

APA

Huang, H., Huang, T., Ma, L., & Zhang, L. (2019). Keyphrase generation with word attention. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11955 LNCS, pp. 270–281). Springer. https://doi.org/10.1007/978-3-030-36718-3_23
