Keeping notes: Conditional natural language generation with a scratchpad mechanism

5 citations · 160 Mendeley readers

Abstract

We introduce the Scratchpad Mechanism, a novel addition to the sequence-to-sequence (seq2seq) neural network architecture, and demonstrate its effectiveness in improving the overall fluency of seq2seq models for natural language generation tasks. By enabling the decoder at each time step to write to all of the encoder outputs, Scratchpad can employ the encoder as a “scratchpad” memory to keep track of what has been generated so far and thereby guide future generation. We evaluate Scratchpad on three well-studied natural language generation tasks (machine translation, question generation, and text summarization) and obtain state-of-the-art or comparable performance on standard datasets for each task. Qualitative assessments in the form of human judgements (question generation), attention visualization (machine translation), and sample output (summarization) provide further evidence of Scratchpad's ability to generate fluent and expressive output.
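The core operation the abstract describes, the decoder writing back into the encoder outputs after every step, can be sketched compactly. The PyTorch module below is a minimal illustration under assumed forms: a per-position sigmoid “keep” gate combined convexly with a global tanh write vector. The ScratchpadWriter name and both linear layers are illustrative choices for this sketch, not the paper's exact parameterization.

import torch
import torch.nn as nn

class ScratchpadWriter(nn.Module):
    """One scratchpad write: the decoder state updates every encoder output.

    The gate/update forms here are illustrative assumptions, not the
    paper's exact equations.
    """

    def __init__(self, enc_dim: int, dec_dim: int):
        super().__init__()
        # Per-position gate: how much each encoder cell keeps its old value.
        self.gate = nn.Linear(dec_dim + enc_dim, 1)
        # Global write vector: the "note" the decoder jots down this step.
        self.update = nn.Linear(dec_dim, enc_dim)

    def forward(self, enc_states: torch.Tensor, dec_state: torch.Tensor) -> torch.Tensor:
        # enc_states: (batch, src_len, enc_dim); dec_state: (batch, dec_dim)
        batch, src_len, _ = enc_states.shape
        dec_exp = dec_state.unsqueeze(1).expand(batch, src_len, dec_state.size(-1))
        # alpha in [0, 1] per source position: keep old content vs. overwrite.
        alpha = torch.sigmoid(self.gate(torch.cat([dec_exp, enc_states], dim=-1)))
        # Broadcast the global note over all source positions.
        note = torch.tanh(self.update(dec_state)).unsqueeze(1)
        return alpha * enc_states + (1.0 - alpha) * note

In a full seq2seq decoder this write would run once per time step, after attention is computed, so that attention at step t+1 reads the updated memory rather than the static encoder outputs.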

Citation (APA)

Benmalek, R. Y., Khabsa, M., Desu, S., Cardie, C., & Banko, M. (2019). Keeping notes: Conditional natural language generation with a scratchpad mechanism. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (pp. 4157–4167). Association for Computational Linguistics. https://doi.org/10.18653/v1/p19-1407
