Posterior control of blackbox generation

Abstract

Text generation often requires high-precision output that obeys task-specific rules. This fine-grained control is difficult to enforce with off-the-shelf deep learning models. In this work, we consider augmenting neural generation models with discrete control states learned through a structured latent-variable approach. Under this formulation, task-specific knowledge can be encoded through a range of rich, posterior constraints that are effectively trained into the model. This approach allows users to ground internal model decisions based on prior knowledge, without sacrificing the representational power of neural generative models. Experiments consider applications of this approach for text generation. We find that this method improves over standard benchmarks, while also providing fine-grained control.
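To make the idea of "posterior constraints on discrete control states" concrete, the following is a minimal illustrative sketch in PyTorch. It is not the paper's implementation; all names (num_states, allowed_mask, the penalty weight) are assumptions chosen for illustration. The sketch shows one simple way prior task knowledge could be expressed as a penalty on the approximate posterior over latent control states, added to the usual generation loss.

import torch
import torch.nn.functional as F

def posterior_constraint_penalty(state_logits, allowed_mask):
    """Penalize posterior mass on control states that task rules forbid.

    state_logits: (batch, seq_len, num_states) unnormalized scores for the
                  approximate posterior over discrete control states.
    allowed_mask: (batch, seq_len, num_states) with 1.0 where a state is
                  permitted by prior knowledge at that position, 0.0 otherwise.
    """
    q = F.softmax(state_logits, dim=-1)                    # q(z | x), per position
    forbidden_mass = (q * (1.0 - allowed_mask)).sum(-1)    # posterior mass on disallowed states
    return forbidden_mass.mean()                           # scalar penalty to add to the loss

# Usage sketch (hypothetical names): the constraint is trained into the model
# by adding it to the negative log-likelihood with a weight lambda_pc.
# total_loss = nll_loss + lambda_pc * posterior_constraint_penalty(state_logits, allowed_mask)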

Citation (APA)
Li, X. L., & Rush, A. M. (2020). Posterior control of blackbox generation. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 2731–2743). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.acl-main.243
