An expectation-maximization approach to tuning generalized vector approximate message passing

Abstract

Generalized Vector Approximate Message Passing (GVAMP) is an efficient iterative algorithm for approximate minimum-mean-squared-error (MMSE) estimation of a random vector x ~ p_x(x) from generalized linear measurements, i.e., measurements of the form y = Q(z) where z = Ax with known A, and Q(·) is a noisy, potentially nonlinear, componentwise function. Problems of this form arise in numerous applications, including robust regression, binary classification, quantized compressive sensing, and phase retrieval. In some cases, the prior p_x(x) and/or the channel Q(·) depend on unknown deterministic parameters θ, which prevents a direct application of GVAMP. In this paper we propose a way to combine expectation maximization (EM) with GVAMP to jointly estimate x and θ. We then demonstrate how EM-GVAMP can solve the phase retrieval problem with unknown measurement-noise variance.
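
To make the measurement model and the EM outer loop concrete, below is a minimal Python sketch under a phase-retrieval-like channel y = |Ax| + w with unknown noise variance θ. It is not the authors' implementation: the inner GVAMP pass is replaced by an oracle placeholder, and the function names (inner_estimator, em_noise_variance) are hypothetical; only the overall EM-around-an-inner-solver structure and the closed-form noise-variance M-step are illustrated.

```python
import numpy as np

# Minimal sketch of an EM outer loop for a phase-retrieval-like channel
# y = |A x| + w with w ~ N(0, theta) and theta unknown. NOT the authors'
# implementation: the inner GVAMP pass is replaced by an oracle placeholder,
# and all function names here are hypothetical.

rng = np.random.default_rng(0)
n, m = 256, 512
A = rng.standard_normal((m, n)) / np.sqrt(n)          # known measurement matrix
x_true = rng.standard_normal(n)                       # signal to recover
theta_true = 0.05                                     # unknown noise variance
z_true = A @ x_true
y = np.abs(z_true) + np.sqrt(theta_true) * rng.standard_normal(m)

def inner_estimator(y, A, theta):
    """Stand-in for the inner solver (GVAMP) run at the current theta.
    A real implementation would return approximate posterior means and
    variances of z; this oracle returns the true z with zero posterior
    variance so the EM update below can be exercised in isolation."""
    return z_true.copy(), np.zeros_like(y)

def em_noise_variance(y, z_hat, z_var):
    """EM M-step for theta in y = |z| + w, w ~ N(0, theta), using the
    approximations E[|z| | y] ~= |z_hat| and Var(|z| | y) ~= z_var:
    theta <- mean_i[ (y_i - |z_hat_i|)^2 + z_var_i ]."""
    return np.mean((y - np.abs(z_hat)) ** 2 + z_var)

theta = 1.0                                           # deliberately poor initialization
for _ in range(10):
    z_hat, z_var = inner_estimator(y, A, theta)       # E-step via the inner solver
    theta = em_noise_variance(y, z_hat, z_var)        # closed-form M-step

print(f"learned noise variance: {theta:.4f}  (true: {theta_true})")
```

With the oracle placeholder, the M-step recovers a noise-variance estimate close to theta_true after a single pass; in EM-GVAMP the same outer loop would instead wrap the actual GVAMP iterations, whose posterior statistics of z drive the update.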

Citation (APA)

Metzler, C. A., Schniter, P., & Baraniuk, R. G. (2018). An expectation-maximization approach to tuning generalized vector approximate message passing. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 10891 LNCS, pp. 395–406). Springer Verlag. https://doi.org/10.1007/978-3-319-93764-9_37
