Truncated variational sampling for ‘Black Box’ optimization of generative models

3 citations · 6 Mendeley readers

Abstract

We investigate the optimization of two probabilistic generative models with binary latent variables using a novel variational EM approach. The approach distinguishes itself from previous variational approaches by using latent states as variational parameters. Here we use efficient, general-purpose sampling procedures to vary the latent states, and investigate the "black box" applicability of the resulting optimization approach. For general-purpose applicability, samples are drawn from approximate marginal distributions as well as from the prior distribution of the considered generative model. As such, sampling is defined in a generic form with no analytical derivations required. As a proof of concept, we then apply the novel procedure (A) to Binary Sparse Coding (a model with continuous observables), and (B) to basic Sigmoid Belief Networks (models with binary observables). Numerical experiments verify that the investigated approach efficiently and effectively increases a variational free energy objective without requiring any additional analytical steps.
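To make the abstract's description concrete, the sketch below illustrates a truncated variational E-step of the kind described, for a toy Binary Sparse Coding model (Bernoulli prior on binary latents, Gaussian observables). The variational parameters are a set K of latent states per data point; candidate states are sampled from the prior and from approximate posterior marginals, and the |K| states with the highest joint probability are kept, which can only increase the truncated free energy. All sizes, parameter values, and function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Binary Sparse Coding model: p(s_h = 1) = pi, p(y|s) = N(W s, sigma^2 I).
# Dimensions and parameter values are arbitrary choices for this sketch.
H, D, S = 8, 5, 16              # latents, observables, truncation size |K|
pi, sigma = 0.2, 0.5
W = rng.normal(size=(D, H))

def log_joint(y, states):
    """log p(y, s) for each binary state s (one per row of `states`)."""
    mean = states @ W.T                                   # predicted means
    ll = (-0.5 * np.sum((y - mean) ** 2, axis=1) / sigma**2
          - 0.5 * D * np.log(2 * np.pi * sigma**2))       # Gaussian likelihood
    k = states.sum(axis=1)
    lp = k * np.log(pi) + (H - k) * np.log(1 - pi)        # Bernoulli prior
    return ll + lp

def free_energy(y, states):
    """Truncated free energy: log of the joint mass on the state set K."""
    return np.logaddexp.reduce(log_joint(y, states))

def tvs_e_step(y, K, n_new=16):
    """One sampling-based update of the truncated state set K."""
    lj = log_joint(y, K)
    # Proposal 1: samples from the Bernoulli prior.
    from_prior = (rng.random((n_new, H)) < pi).astype(int)
    # Proposal 2: samples from approximate posterior marginals, here
    # crudely estimated from the current truncated posterior (an assumption).
    w = np.exp(lj - lj.max()); w /= w.sum()
    q = np.clip(w @ K, 0.01, 0.99)
    from_marg = (rng.random((n_new, H)) < q).astype(int)
    # Keep the |K| states with the highest joint among old and proposed
    # states; this selection cannot decrease the truncated free energy.
    cand = np.unique(np.vstack([K, from_prior, from_marg]), axis=0)
    top = np.argsort(log_joint(y, cand))[-S:]
    return cand[top]
```

Iterating `tvs_e_step` on a data point monotonically increases `free_energy` for fixed model parameters; an M-step would then update `W`, `pi`, and `sigma` using expectations under the truncated posterior on K.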

Citation (APA)

Lücke, J., Dai, Z., & Exarchakis, G. (2018). Truncated variational sampling for ‘Black Box’ optimization of generative models. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 10891 LNCS, pp. 467–478). Springer Verlag. https://doi.org/10.1007/978-3-319-93764-9_43
