Event Representation with Sequential, Semi-Supervised Discrete Variables

Abstract

Within the context of event modeling and understanding, we propose a new method for neural sequence modeling that takes partially-observed sequences of discrete, external knowledge into account. We construct a sequential neural variational autoencoder, which uses Gumbel-Softmax reparametrization within a carefully defined encoder, to allow for successful backpropagation during training. The core idea is to allow semi-supervised external discrete knowledge to guide, but not restrict, the variational latent parameters during training. Our experiments indicate that our approach not only outperforms multiple baselines and the state-of-the-art in narrative script induction, but also converges more quickly.
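To make the abstract's central mechanism concrete, the sketch below illustrates the general Gumbel-Softmax reparametrization trick it refers to: drawing a differentiable, approximately one-hot sample over discrete latent states so that gradients can flow through the discrete variable during training. This is a generic illustration, not the authors' implementation; the temperature `tau`, the number of states, and the tensor shapes are assumed for the example.

```python
import torch
import torch.nn.functional as F

def gumbel_softmax_sample(logits: torch.Tensor, tau: float = 1.0) -> torch.Tensor:
    """Draw a relaxed one-hot sample from Categorical(softmax(logits)).

    Adding Gumbel(0, 1) noise to the logits and applying a temperature-scaled
    softmax yields a sample that approaches a hard one-hot vector as tau -> 0
    while remaining differentiable with respect to `logits`, which is what
    allows backpropagation through a discrete latent variable.
    """
    gumbel_noise = -torch.log(-torch.log(torch.rand_like(logits) + 1e-20) + 1e-20)
    return F.softmax((logits + gumbel_noise) / tau, dim=-1)

# Toy usage: an encoder producing logits over, say, 8 discrete event states
# (hypothetical sizes chosen for illustration only).
logits = torch.randn(4, 8, requires_grad=True)   # (batch, num_states)
z = gumbel_softmax_sample(logits, tau=0.5)       # relaxed one-hot latent sample
loss = z.sum()                                   # stand-in for a downstream loss
loss.backward()                                  # gradients reach `logits`
```

In a semi-supervised setting like the one the abstract describes, partially observed discrete labels can be used to guide the encoder's logits (e.g., through an auxiliary loss) without hard-constraining the latent sample itself.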

Citation (APA)

Rezaee, M., & Ferraro, F. (2021). Event Representation with Sequential, Semi-Supervised Discrete Variables. In NAACL-HLT 2021 - 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Proceedings of the Conference (pp. 4701–4716). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.naacl-main.374
