Fast hybrid algorithm for big matrix recovery

0 citations · 15 Mendeley readers

Abstract

The large-scale Nuclear Norm penalized Least Squares problem (NNLS) arises frequently in the estimation of low-rank structures. In this paper we accelerate the solution procedure by combining non-smooth convex optimization with smooth Riemannian methods. Our methods comprise two phases. In the first phase, we use the Alternating Direction Method of Multipliers (ADMM) both to identify the fixed-rank manifold on which an optimum resides and to provide an initializer for the subsequent refinement. In the second phase, two superlinearly convergent Riemannian methods, Riemannian Newton (NT) and Riemannian Conjugate Gradient descent (CG), are adopted to improve the approximation over the fixed-rank manifold. We prove that our hybrid method of ADMM and NT (HADMNT) converges to an optimum of NNLS at least quadratically. Experiments on large-scale collaborative filtering datasets demonstrate very competitive performance of these fast hybrid methods compared to the state of the art.
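The two-phase scheme described in the abstract can be illustrated with a small sketch. This is not the paper's implementation: it assumes a matrix-completion instance of NNLS, uses a basic ADMM with singular value thresholding for phase one, and substitutes plain gradient descent on a factored parameterization for the Riemannian NT/CG refinement of phase two. All function names, step sizes, and thresholds below are illustrative choices, not from the paper.

```python
import numpy as np

def svt(Y, tau):
    """Singular value thresholding: the prox operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    s = np.maximum(s - tau, 0.0)
    return (U * s) @ Vt, s

def admm_nnls(B, mask, lam=0.5, rho=1.0, n_iter=300):
    """Phase 1 (sketch): ADMM for min_X 0.5||mask*(X - B)||_F^2 + lam||X||_*,
    split as X = Z. Returns the low-rank iterate and the detected rank,
    i.e. the fixed-rank manifold on which phase 2 refines."""
    X = np.zeros_like(B)
    Z = np.zeros_like(B)
    Uu = np.zeros_like(B)  # scaled dual variable
    s = np.zeros(min(B.shape))
    for _ in range(n_iter):
        # X-update has an entrywise closed form under the observation mask.
        X = np.where(mask, (B + rho * (Z - Uu)) / (1.0 + rho), Z - Uu)
        # Z-update: shrink singular values by lam/rho.
        Z, s = svt(X + Uu, lam / rho)
        Uu += X - Z
    rank = int(np.sum(s > 1e-6))  # SVT zeroes small singular values exactly
    return Z, rank

def refine_fixed_rank(B, mask, X0, rank, lr=0.01, n_iter=300):
    """Phase 2 (simplified): refine on the rank-r set via X = L @ R.T with
    joint gradient steps. The paper instead uses superlinearly convergent
    Riemannian Newton / Conjugate Gradient on the fixed-rank manifold."""
    U, s, Vt = np.linalg.svd(X0, full_matrices=False)
    L = U[:, :rank] * np.sqrt(s[:rank])
    R = Vt[:rank].T * np.sqrt(s[:rank])
    for _ in range(n_iter):
        res = mask * (L @ R.T - B)  # residual on observed entries only
        # Simultaneous gradient step on (L, R) from the old values.
        L, R = L - lr * res @ R, R - lr * res.T @ L
    return L @ R.T
```

In this toy setting the ADMM phase both denoises toward a low-rank solution and exposes the rank (via the exact zeros produced by thresholding), after which the factored refinement removes the shrinkage bias on the observed entries, mirroring the initializer-plus-refinement division of labor described above.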

Citation (APA)

Zhou, T., Qian, H., Shen, Z., & Xu, C. (2016). Fast hybrid algorithm for big matrix recovery. In 30th AAAI Conference on Artificial Intelligence, AAAI 2016 (pp. 1444–1450). AAAI press. https://doi.org/10.1609/aaai.v30i1.10161
