A Timestep aware Sentence Embedding and Acme Coverage for Brief but Informative Title Generation


Abstract

The title generation task, which summarizes article content in recapitulatory words, relies heavily on utilizing the corresponding key context. To generate a title that covers the appropriate information in the content while avoiding repetition, we propose a title generation framework with two complementary components. First, we propose a Timestep aware Sentence Embedding (TSE) mechanism, which updates the sentences' representations by re-locating the critical words in each sentence at every decoding step. Second, we present an Acme Coverage (AC) mechanism that mitigates the repetition problem and preserves the remaining valuable keywords after each decoding step according to the final vocabulary distribution. We conduct comprehensive experiments on various title generation tasks with different backbones, and the resulting ROUGE and METEOR scores outperform most existing state-of-the-art approaches to varying degrees. The experimental results demonstrate the effectiveness and generality of our generation framework, TSE-AC.
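To make the two mechanisms concrete, the sketch below shows one plausible reading of the abstract in PyTorch, not the authors' implementation: sentence embeddings are refreshed at each decoding step by attending over each sentence's word states with the current decoder state as the query (the "re-locating" of critical words), and a coverage-style penalty is accumulated over the final vocabulary distributions to discourage repetition. The function names, the projection matrices `W_q` and `W_k`, and the exact form of the penalty are assumptions for illustration.

```python
# Hypothetical sketch of a timestep-aware sentence embedding update and a
# coverage-style repetition penalty over vocabulary distributions.
# This is an assumed reading of the abstract, not the paper's released code.
import torch
import torch.nn.functional as F


def timestep_aware_sentence_embeddings(word_states, sent_ids, decoder_state, W_q, W_k):
    """Refresh sentence embeddings for the current decoding step.

    word_states:   (num_words, d) encoder states for all words in the article
    sent_ids:      (num_words,)   index of the sentence each word belongs to
    decoder_state: (d,)           decoder hidden state at the current step
    W_q, W_k:      (d, d)         query/key projections (hypothetical parameters)
    Returns a (num_sents, d) tensor of step-specific sentence embeddings.
    """
    query = W_q @ decoder_state                      # project the decoder state to a query
    scores = (word_states @ W_k.T) @ query           # per-word relevance at this step
    num_sents = int(sent_ids.max().item()) + 1
    sent_embs = torch.zeros(num_sents, word_states.size(1))
    for s in range(num_sents):
        mask = sent_ids == s
        attn = F.softmax(scores[mask], dim=0)        # re-locate critical words within sentence s
        sent_embs[s] = attn @ word_states[mask]      # weighted sum = refreshed sentence embedding
    return sent_embs


def coverage_penalty(vocab_dists):
    """One common coverage-style penalty applied to vocabulary distributions.

    vocab_dists: (T, V) final vocabulary distribution at each decoding step.
    Sums the element-wise minima between the current distribution and the
    accumulated coverage of past steps, so repeated mass is penalized.
    """
    coverage = torch.zeros(vocab_dists.size(1))
    penalty = torch.tensor(0.0)
    for p_t in vocab_dists:
        penalty = penalty + torch.minimum(p_t, coverage).sum()
        coverage = coverage + p_t
    return penalty
```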

Cite (APA)

Wang, Q., Lin, X., & Wang, F. (2022). A Timestep aware Sentence Embedding and Acme Coverage for Brief but Informative Title Generation. In Findings of the Association for Computational Linguistics: NAACL 2022 (pp. 1906–1918). Association for Computational Linguistics. https://doi.org/10.18653/v1/2022.findings-naacl.146
