TopNet: Learning from Neural Topic Model to Generate Long Stories


Abstract

Long story generation (LSG) is one of the coveted goals in natural language processing. Unlike most text generation tasks, LSG requires producing a long, content-rich story from a much shorter text input, and it often suffers from information sparsity. In this paper, we propose TopNet to alleviate this problem by leveraging recent advances in neural topic modeling to obtain high-quality skeleton words that complement the short input. In particular, instead of directly generating a story, we first learn to map the short text input to a low-dimensional topic distribution (pre-assigned by a topic model). Based on this latent topic distribution, we use the reconstruction decoder of the topic model to sample a sequence of inter-related words as a skeleton for the story. Experiments on two benchmark datasets show that our proposed framework is highly effective in skeleton word selection and significantly outperforms state-of-the-art models in both automatic and human evaluation.
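The sampling step described above (turning a predicted topic distribution into skeleton words via the topic model's reconstruction decoder) can be sketched in a minimal, hypothetical form. All names and sizes here are illustrative assumptions, not the paper's actual implementation: `beta` stands in for a pretrained topic-word matrix, and `theta` for the topic distribution predicted from the short input.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes; a real topic model would have far larger values.
num_topics, vocab_size = 4, 10
vocab = [f"word{i}" for i in range(vocab_size)]

# beta: topic-word matrix (each row is a distribution over the vocabulary),
# standing in for the reconstruction decoder of a pretrained topic model.
beta = rng.dirichlet(np.ones(vocab_size), size=num_topics)

# theta: latent topic distribution predicted from the short input
# (fixed here purely for illustration).
theta = np.array([0.7, 0.2, 0.05, 0.05])

# Mixture over the vocabulary: p(w) = sum_k theta_k * beta[k, w]
word_probs = theta @ beta

# Sample a skeleton of distinct, topically inter-related words.
skeleton = rng.choice(vocab, size=5, replace=False, p=word_probs)
print(list(skeleton))
```

The skeleton words sampled this way would then be fed, together with the short input, to a story generator; this sketch only covers the word-selection step.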

Citation (APA)

Yang, Y., Pan, B., Cai, D., & Sun, H. (2021). TopNet: Learning from Neural Topic Model to Generate Long Stories. In Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (pp. 1997–2005). Association for Computing Machinery. https://doi.org/10.1145/3447548.3467410
