Reparameterizable subset sampling via continuous relaxations


Abstract

Many machine learning tasks require sampling a subset of items from a collection based on a parameterized distribution. The Gumbel-softmax trick can be used to sample a single item, and allows for low-variance reparameterized gradients with respect to the parameters of the underlying distribution. However, stochastic optimization involving subset sampling is typically not reparameterizable. To overcome this limitation, we define a continuous relaxation of subset sampling that provides reparameterization gradients by generalizing the Gumbel-max trick. We use this approach to sample subsets of features in an instance-wise feature selection task for model interpretability, subsets of neighbors to implement a deep stochastic k-nearest neighbors model, and sub-sequences of neighbors to implement parametric t-SNE by directly comparing the identities of local neighbors. We improve performance in all these tasks by incorporating subset sampling in end-to-end training.
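The construction described in the abstract can be sketched in a few lines: perturb the log-weights with Gumbel noise, take the top-k perturbed keys for an exact subset sample, and replace the hard top-k with k successive tempered softmaxes to obtain a differentiable relaxation. This is a minimal illustrative sketch, not the authors' reference implementation; the function names and the NumPy setting are assumptions.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array.
    e = np.exp(x - x.max())
    return e / e.sum()

def sample_subset(log_w, k, tau=1.0, rng=None):
    """Sample a size-k subset from weights exp(log_w).

    Returns a hard 0/1 indicator (exact sample via the generalized
    Gumbel-max trick) and a relaxed vector whose entries sum to k
    (differentiable surrogate, temperature tau).
    """
    rng = np.random.default_rng() if rng is None else rng
    # Perturb log-weights with i.i.d. Gumbel(0, 1) noise.
    keys = log_w + rng.gumbel(size=log_w.shape)

    # Exact subset sample: indices of the k largest perturbed keys.
    hard = np.zeros_like(keys)
    hard[np.argsort(keys)[-k:]] = 1.0

    # Continuous relaxation: k rounds of tempered softmax, each round
    # down-weighting mass already "selected" in earlier rounds.
    relaxed = np.zeros_like(keys)
    a = keys.astype(float).copy()
    for _ in range(k):
        p = softmax(a / tau)
        relaxed += p
        a = a + np.log(np.clip(1.0 - p, 1e-12, None))
    return hard, relaxed
```

As tau decreases, the relaxed vector concentrates on the same indices as the hard sample, while at moderate tau it stays smooth enough to pass low-variance reparameterized gradients to the parameters behind `log_w`.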

Citation (APA)

Xie, S. M., & Ermon, S. (2019). Reparameterizable subset sampling via continuous relaxations. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 2019-August, pp. 3919–3925). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2019/544
