AttentionRank: Unsupervised Keyphrase Extraction Using Self and Cross Attentions

46 citations · 75 Mendeley readers

Abstract

Keyword or keyphrase extraction identifies the words or phrases that represent the main topics of a document. This paper proposes AttentionRank, a hybrid attention model that identifies keyphrases in a document in an unsupervised manner. AttentionRank computes self-attention and cross-attention using a pre-trained language model: the self-attention determines the importance of a candidate within the context of a sentence, while the cross-attention measures the semantic relevance between a candidate and the sentences of a document. We evaluate AttentionRank against seven baselines on three publicly available datasets. The results show that AttentionRank is an effective and robust unsupervised keyphrase extraction model on both long and short documents. Source code is available on GitHub.
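To make the idea concrete, the Python sketch below (not the authors' released implementation) shows one simplified way to score a candidate phrase with a pre-trained encoder: the attention a candidate's tokens receive inside a sentence stands in for the self-attention signal, and cosine similarity between mean-pooled embeddings of the candidate and each sentence stands in for the cross-attention relevance. The model name (bert-base-uncased), the mean pooling, the token-matching heuristic, and the final score combination are all illustrative assumptions, not the paper's exact formulation.

# Minimal sketch of an AttentionRank-style candidate score using HuggingFace
# transformers. Assumptions: bert-base-uncased encoder, mean pooling, cosine
# similarity as a stand-in for the paper's cross-attention computation.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "bert-base-uncased"  # assumption: any BERT-style encoder
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME, output_attentions=True)
model.eval()


def self_attention_score(sentence: str, candidate: str) -> float:
    """Attention mass that all tokens pay to the candidate's tokens,
    averaged over layers and heads (a simplification of the paper's scheme)."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        out = model(**enc)
    # out.attentions: tuple of num_layers tensors, each (batch, heads, seq, seq)
    attn = torch.stack(out.attentions).mean(dim=(0, 2))[0]  # (seq, seq)
    tokens = tokenizer.convert_ids_to_tokens(enc["input_ids"][0].tolist())
    cand_tokens = tokenizer.tokenize(candidate)
    # Crude subword matching; a real system would align candidate spans exactly.
    idx = [i for i, t in enumerate(tokens) if t in cand_tokens]
    if not idx:
        return 0.0
    return attn[:, idx].sum().item()  # attention received by candidate tokens


def embed(text: str) -> torch.Tensor:
    """Mean-pooled last-hidden-state embedding (an assumed pooling choice)."""
    enc = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        out = model(**enc)
    return out.last_hidden_state.mean(dim=1)[0]


def cross_relevance(candidate: str, sentences: list) -> float:
    """Summed cosine similarity between the candidate and each sentence;
    stands in for the cross-attention relevance described in the abstract."""
    cand_vec = embed(candidate)
    sims = [torch.nn.functional.cosine_similarity(cand_vec, embed(s), dim=0)
            for s in sentences]
    return float(sum(sims))


if __name__ == "__main__":
    doc = ["AttentionRank extracts keyphrases without supervision.",
           "It combines self-attention and cross-attention signals."]
    candidate = "keyphrases"
    # Illustrative combination of the two signals; the paper defines its own.
    score = self_attention_score(doc[0], candidate) * cross_relevance(candidate, doc)
    print(f"{candidate}: {score:.4f}")

In a full pipeline, candidate phrases would first be generated (e.g., by noun-phrase chunking), scored over every sentence in which they appear, and then ranked to select the top keyphrases.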

Citation (APA)

Ding, H., & Luo, X. (2021). AttentionRank: Unsupervised keyphrase extraction using self and cross attentions. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021) (pp. 1919–1928). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.emnlp-main.146
