Supplementary Material for “Variational Inference and Model Selection with Generalized Evidence Bounds”

  • Chen, L.
  • Tao, C.
  • Zhang, R.
  • Henao, R.
  • Carin, L.
ISSN: 1938-7228

Abstract

Recent advances in the scalability and flexibility of variational inference have made it successful at unravelling hidden patterns in complex data. In this work we propose a new variational bound formulation, yielding an estimator that extends beyond the conventional variational bound. It naturally subsumes the importance-weighted and Rényi bounds as special cases, and it is provably sharper than these counterparts. We also present an improved estimator for variational learning, and advocate a novel high signal-to-variance ratio update rule for the variational parameters. We discuss model-selection issues associated with existing evidence-lower-bound-based variational inference procedures, and show how to leverage the flexibility of our new formulation to address them. Empirical evidence is provided to validate our claims.
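The importance-weighted bound that the abstract names as a special case can be illustrated on a toy conjugate-Gaussian model. The model, proposal, and all numbers below are illustrative assumptions, not taken from the paper: with p(z) = N(0, 1) and p(x | z) = N(z, 1), the exact log-evidence log p(x) is known in closed form, and a deliberately crude proposal makes the K-sample bound visibly tighten toward it as K grows.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model (for illustration only): p(z) = N(0, 1),
# p(x | z) = N(z, 1), so the marginal is p(x) = N(0, 2).
x = 1.5
log_px = -0.5 * (np.log(2 * np.pi * 2.0) + x ** 2 / 2.0)  # exact log-evidence

def log_normal(v, mean, var):
    """Log-density of N(mean, var) evaluated at v."""
    return -0.5 * (np.log(2 * np.pi * var) + (v - mean) ** 2 / var)

def iw_bound(K, n_mc=200_000):
    """Monte Carlo estimate of the K-sample importance-weighted bound
    E_q[log (1/K) sum_k p(x, z_k) / q(z_k | x)], using the prior as a
    deliberately crude proposal q(z | x) = N(0, 1)."""
    z = rng.standard_normal((n_mc, K))
    # log w_k = log p(z_k) + log p(x | z_k) - log q(z_k); with q equal to
    # the prior, only the likelihood term survives.
    log_w = log_normal(x, z, 1.0)
    # Numerically stable log of the per-row mean of the weights.
    m = log_w.max(axis=1, keepdims=True)
    log_mean_w = m[:, 0] + np.log(np.exp(log_w - m).mean(axis=1))
    return log_mean_w.mean()

b1, b5, b25 = iw_bound(1), iw_bound(5), iw_bound(25)
print(b1, b5, b25, log_px)  # bounds increase with K, approaching log p(x)
```

K = 1 recovers the standard evidence lower bound; larger K gives provably tighter bounds, which is the monotone-sharpening behaviour the paper's generalized formulation extends further.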

Cite (APA)

Chen, L., Tao, C., Zhang, R., Henao, R., & Carin, L. (2018). Supplementary Material for “Variational Inference and Model Selection with Generalized Evidence Bounds.” Proceedings of Machine Learning Research, 80, 892–901. Retrieved from http://proceedings.mlr.press/v80/chen18k.html
