Generating sentences using a dynamic canvas

Abstract

We introduce the Attentive Unsupervised Text (W)riter (AUTR), a word-level generative model for natural language. It uses a recurrent neural network with a dynamic attention and canvas memory mechanism to iteratively construct sentences. By inspecting the state of the canvas at intermediate stages, and where the model places its attention, we gain insight into how it constructs sentences. We demonstrate that AUTR learns a meaningful latent representation for each sentence and achieves competitive log-likelihood lower bounds while being computationally efficient. It is effective at generating and reconstructing sentences, as well as at imputing missing words.
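To make the canvas idea concrete, the following is a minimal NumPy sketch of attention-weighted writing to a canvas of word logits. All parameter names, dimensions, and the update rule here are illustrative assumptions for exposition, not the paper's exact architecture: a recurrent state produces, at each step, an attention distribution over sentence positions and a write vector over the vocabulary, and the canvas accumulates these attention-weighted writes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: sentence length, vocabulary, hidden state, write steps.
seq_len, vocab, hidden, steps = 6, 20, 16, 4

# Hypothetical parameters (random, untrained; for illustration only).
W_h = rng.normal(size=(hidden, hidden)) * 0.1      # recurrent transition
W_attn = rng.normal(size=(hidden, seq_len)) * 0.1  # state -> attention logits
W_write = rng.normal(size=(hidden, vocab)) * 0.1   # state -> write vector

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

canvas = np.zeros((seq_len, vocab))  # accumulates word logits per position
h = rng.normal(size=hidden) * 0.1    # stand-in for a latent-conditioned initial state

for t in range(steps):
    h = np.tanh(W_h @ h)             # recurrent state update
    attn = softmax(h @ W_attn)       # where on the canvas to write (sums to 1)
    write = h @ W_write              # what to write: logits over the vocabulary
    canvas += np.outer(attn, write)  # attention-weighted canvas update

# Read the finished canvas off as one word id per position.
words = canvas.argmax(axis=1)
```

Because the attention weights and canvas are explicit arrays, intermediate values of `attn` and `canvas` can be inspected at each step, which is the kind of interpretability the abstract refers to.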

Citation (APA)
Shah, H., Zheng, B., & Barber, D. (2018). Generating sentences using a dynamic canvas. In 32nd AAAI Conference on Artificial Intelligence, AAAI 2018 (pp. 5430–5437). AAAI press. https://doi.org/10.1609/aaai.v32i1.11951
