Keyphrase generation aims to produce a set of phrases that concisely summarize the semantics of an article, and it underpins many natural language processing tasks. Although most previous approaches have achieved good results, they neglect the independent word-level information in the source text. Previous models use an attention mechanism to relate the encoder RNN hidden states to the target side; however, the hidden state $h_t$ is a summarization of the first $t$ words as a subsequence of the source sentence. In this paper, we propose a novel sequence-to-sequence model called WordRNN, which captures word-level representations of the source text. Our model enriches the representation of the source text by directly incorporating pure word-level information. Moreover, we use a fuse gate and a simple concatenation operation to combine the subsequence-level and word-level contextual information. Experimental results demonstrate that our approach outperforms state-of-the-art methods.
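The combination described above can be illustrated with a minimal sketch. The code below is not the authors' implementation; it assumes simple dot-product attention, a hypothetical gate parameterization $g = \sigma(W_g[c_{seq}; c_{word}])$, and illustrative dimensions, purely to show how a subsequence-level context (from RNN hidden states) and a word-level context (from raw word embeddings) might be fused:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(query, values):
    """Dot-product attention: weights over source positions, then a
    weighted sum of the value vectors as the context."""
    weights = softmax(values @ query)
    return weights @ values

d, T = 8, 5                        # illustrative embedding size / source length
H = rng.standard_normal((T, d))    # encoder RNN hidden states (subsequence level)
E = rng.standard_normal((T, d))    # raw word embeddings (word level)
s = rng.standard_normal(d)         # decoder state used as the attention query

c_seq = attend(s, H)               # subsequence-level context
c_word = attend(s, E)              # word-level context

# Fuse gate (hypothetical parameterization): a sigmoid gate g in (0, 1)
# blends the two context vectors elementwise.
W_g = rng.standard_normal((d, 2 * d))
g = 1.0 / (1.0 + np.exp(-(W_g @ np.concatenate([c_seq, c_word]))))
c_fused = g * c_seq + (1.0 - g) * c_word

# The simpler alternative mentioned in the abstract: plain concatenation.
c_concat = np.concatenate([c_seq, c_word])
```

Under this sketch, `c_fused` keeps the model dimension `d`, while `c_concat` doubles it and would require a wider downstream projection; that trade-off is one plausible reason to offer both combination schemes.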
Huang, H., Huang, T., Ma, L., & Zhang, L. (2019). Keyphrase generation with word attention. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11955 LNCS, pp. 270–281). Springer. https://doi.org/10.1007/978-3-030-36718-3_23