S-LSTM-GAN: Shared Recurrent Neural Networks with Adversarial Training

Abstract

In this paper, we propose a new architecture, the Shared-LSTM Generative Adversarial Network (S-LSTM-GAN), which trains recurrent neural networks (RNNs) via an adversarial process, and we apply it to a handwritten digit database. We successfully train the network on the generator task of handwritten digit generation and the discriminator task of digit classification. We demonstrate the potential of this architecture through conditional and quantifiable evaluation of its generated samples.
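The abstract does not spell out the architecture, so the following is only a minimal sketch of the shared-LSTM idea it describes: a single recurrent backbone whose weights are used by both the generator and the discriminator, trained adversarially on 28×28 handwritten digits generated row by row. The framework (PyTorch), layer sizes, and the row-by-row formulation are assumptions for illustration, not the authors' exact model.

```python
# Sketch of a shared-LSTM GAN. Assumptions: PyTorch, 28x28 digits treated as
# 28-step sequences of rows, hypothetical hidden sizes; not the paper's model.
import torch
import torch.nn as nn

IMG_ROWS, IMG_COLS, NOISE_DIM, HIDDEN = 28, 28, 64, 128

class SharedLSTM(nn.Module):
    """Recurrent backbone whose weights are shared by generator and discriminator."""
    def __init__(self):
        super().__init__()
        self.lstm = nn.LSTM(input_size=IMG_COLS, hidden_size=HIDDEN, batch_first=True)

    def forward(self, rows):                     # rows: (batch, 28, 28)
        out, _ = self.lstm(rows)                 # out: (batch, 28, HIDDEN)
        return out

class Generator(nn.Module):
    """Maps noise to a sequence of image rows through the shared LSTM."""
    def __init__(self, shared):
        super().__init__()
        self.shared = shared
        self.from_noise = nn.Linear(NOISE_DIM, IMG_ROWS * IMG_COLS)
        self.to_pixels = nn.Linear(HIDDEN, IMG_COLS)

    def forward(self, z):                        # z: (batch, NOISE_DIM)
        seed = self.from_noise(z).view(-1, IMG_ROWS, IMG_COLS)
        h = self.shared(seed)
        return torch.sigmoid(self.to_pixels(h))  # (batch, 28, 28) in [0, 1]

class Discriminator(nn.Module):
    """Scores real vs. generated digits from the shared LSTM's final state."""
    def __init__(self, shared):
        super().__init__()
        self.shared = shared
        self.head = nn.Linear(HIDDEN, 1)

    def forward(self, img):                      # img: (batch, 28, 28)
        h = self.shared(img)
        return self.head(h[:, -1])               # real/fake logit, last time step

shared = SharedLSTM()
G, D = Generator(shared), Discriminator(shared)

# One generator update on a fake batch (real-data and optimizer steps omitted).
z = torch.randn(16, NOISE_DIM)
fake = G(z)
loss = nn.functional.binary_cross_entropy_with_logits(
    D(fake), torch.ones(16, 1))                  # generator wants D to output "real"
loss.backward()
```

In this sketch the discriminator only outputs a real/fake score; the classification task mentioned in the abstract could be added as a second linear head over the same shared features.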

Citation (APA)

Adate, A., & Tripathy, B. K. (2019). S-LSTM-GAN: Shared recurrent neural networks with adversarial training. In Advances in Intelligent Systems and Computing (Vol. 828, pp. 107–115). Springer Verlag. https://doi.org/10.1007/978-981-13-1610-4_11
