Truncated Inference for Latent Variable Optimization Problems: Application to Robust Estimation and Learning

Abstract

Optimization problems with an auxiliary latent variable structure in addition to the main model parameters occur frequently in computer vision and machine learning. The additional latent variables make the underlying optimization task expensive, either in terms of memory (maintaining the latent variables) or in terms of runtime (repeated exact inference of latent variables). We aim to remove the need to maintain the latent variables and propose two formally justified methods that dynamically adapt the required accuracy of latent variable inference. These methods have applications in large-scale robust estimation and in learning energy-based models from labeled data.
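To make the idea concrete, below is a minimal, hedged sketch of truncated latent variable inference in a robust estimation setting. It is *not* the paper's actual algorithm: it illustrates the generic pattern with a lifted (half-quadratic) robust least-squares problem, where each residual has a latent weight. Instead of solving for the weights exactly in every round, only a damped partial update is taken. The kernel (a Geman-McClure-style weight with unit scale), the damping factor, and all function names are illustrative assumptions.

```python
import numpy as np

def robust_fit(A, b, num_iters=50, inner_steps=1):
    """Lifted robust least squares: min over (x, w) of
    sum_i w_i * r_i(x)^2 + psi(w_i), where r = A @ x - b and psi is the
    penalty dual to a Geman-McClure-style kernel (unit scale assumed).
    Instead of exact inference of w each round, take only `inner_steps`
    damped updates toward the closed-form optimum (truncated inference).
    """
    m, n = A.shape
    x = np.zeros(n)
    w = np.ones(m)  # latent per-residual weights
    for _ in range(num_iters):
        r = A @ x - b
        # Closed-form optimal weight for the unit-scale GM kernel.
        w_star = 1.0 / (1.0 + r**2)**2
        # Truncated inference: move w only part-way toward w_star.
        for _ in range(inner_steps):
            w = 0.5 * (w + w_star)
        # Weighted least-squares update of the model parameters
        # (small ridge term for numerical safety).
        Aw = A * w[:, None]
        x = np.linalg.solve(A.T @ Aw + 1e-8 * np.eye(n), Aw.T @ b)
    return x
```

For example, fitting a line `b = 2t + 1` where 10% of the points are shifted by a large offset: the truncated weight updates progressively downweight the outliers, and the recovered slope and intercept approach the inlier-generating values even though the latent weights are never solved for exactly.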

Citation (APA)
Zach, C., & Le, H. (2020). Truncated Inference for Latent Variable Optimization Problems: Application to Robust Estimation and Learning. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 12371 LNCS, pp. 464–480). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-58574-7_28
