Neural diverse abstractive sentence compression generation

Abstract

In this work, we contribute a novel abstractive sentence compression model that generates diverse compressed sentences with paraphrasing, using a neural seq2seq encoder-decoder model. We apply several operations to generate diverse abstractive compressions at the sentence level, which has not been addressed in previous research. Our model jointly improves the information coverage and abstractiveness of the generated sentences. We conduct our experiments on human-generated abstractive sentence compression datasets and evaluate our system with several recently proposed Machine Translation (MT) evaluation metrics. Our experiments demonstrate that our methods bring significant improvements over state-of-the-art methods across different metrics.
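The abstract does not specify the decoding strategy used to obtain diverse outputs from the seq2seq decoder. One common technique for this is diverse beam search, sketched below over a toy next-token scorer: beam groups are decoded in sequence, and later groups are penalized for selecting tokens that earlier groups already chose at the same time step. The vocabulary, the `toy_logprobs` scorer, and the penalty value are illustrative assumptions, not details from the paper.

```python
import math

# Toy vocabulary and next-token log-probabilities (an illustrative stand-in
# for a real seq2seq decoder's softmax output over the target vocabulary).
VOCAB = ["the", "cat", "sat", "mat", "<eos>"]

def toy_logprobs(prefix):
    # Deterministic fake scores that slightly favor earlier vocabulary
    # entries and discourage repeating tokens already in the prefix.
    scores = [-0.1 * i - (1.0 if tok in prefix else 0.0)
              for i, tok in enumerate(VOCAB)]
    z = math.log(sum(math.exp(s) for s in scores))  # normalize to log-probs
    return {tok: s - z for tok, s in zip(VOCAB, scores)}

def diverse_beam_search(num_groups=2, beams_per_group=2, steps=3, penalty=1.5):
    """Diverse beam search: groups are expanded in order, and each group
    pays a penalty for tokens earlier groups picked at the same step."""
    groups = [[((), 0.0)] for _ in range(num_groups)]  # (prefix, score) pairs
    for _ in range(steps):
        chosen_this_step = set()
        for g in range(num_groups):
            candidates = []
            for prefix, score in groups[g]:
                for tok, lp in toy_logprobs(prefix).items():
                    div = penalty if tok in chosen_this_step else 0.0
                    candidates.append((prefix + (tok,), score + lp - div))
            candidates.sort(key=lambda c: c[1], reverse=True)
            groups[g] = candidates[:beams_per_group]
            for prefix, _ in groups[g]:
                chosen_this_step.add(prefix[-1])
    return [" ".join(p) for grp in groups for p, _ in grp]

if __name__ == "__main__":
    for hyp in diverse_beam_search():
        print(hyp)
```

With the toy scorer above, the second group's top hypothesis starts with a different token than the first group's, which is exactly the inter-group diversity the penalty term enforces.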

Citation (APA)

Nayeem, M. T., Fuad, T. A., & Chali, Y. (2019). Neural diverse abstractive sentence compression generation. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11438 LNCS, pp. 109–116). Springer Verlag. https://doi.org/10.1007/978-3-030-15719-7_14
