NLRR++: Scalable subspace clustering via non-convex block coordinate descent

Abstract

Low-Rank Representation (LRR) is one of the most effective algorithms for subspace clustering, as it exploits the multiple subspace structures of the data. LRR is usually formulated as a constrained convex optimization problem regularized with the nuclear norm; it requires a singular value decomposition (SVD) in every iteration, which is challenging in terms of both time complexity and memory. Recently, a non-convex formulation, NLRR [1], based on a matrix factorization framework has been proposed and has become one of the state-of-the-art techniques for subspace clustering. However, NLRR cannot scale to problems with large n (the number of samples), since it requires either the inversion of an n × n matrix or the solution of an n × n linear system. In this paper, we develop a faster algorithm for solving the NLRR problem, reducing the time complexity per iteration from O(n³) to O(mnd) and the memory complexity from O(n²) to O(mn), where m is the dimensionality and d is the target rank (usually d ≪ m ≪ n). The main idea of our algorithm is to reformulate NLRR as a sum of rank-one components and apply column-wise block coordinate descent to update each component iteratively. Our new method is considerably faster and more accurate in practice for subspace clustering, and we show that it is guaranteed to converge to stationary points. Moreover, considering the high memory and computational demands of the final spectral clustering phase, we also propose an efficient clustering approach that further boosts the performance of subspace clustering. Experiments on extensive simulations and real datasets confirm the efficiency of our proposed NLRR++. In particular, it is more than 12 times faster than the state-of-the-art subspace clustering method NLRR, and it is the only subspace clustering algorithm that can scale to the ImageNet dataset with 120K samples.
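To make the "sum of rank-one components" idea concrete, the sketch below applies column-wise block coordinate descent to a plain low-rank factorization X ≈ Σⱼ uⱼvⱼᵀ, updating one rank-one component at a time against a maintained residual. This is an illustrative simplification, not the actual NLRR++ objective (which involves the self-expressive representation matrix and its regularization); the function name and update rules are assumptions for demonstration. Note that each column update costs O(mn), so one full sweep over the d columns costs O(mnd), and only the m × n residual is stored, matching the complexities claimed in the abstract.

```python
import numpy as np

def rank1_bcd(X, d, n_sweeps=20, seed=0):
    """Column-wise BCD on X ~ sum_j u_j v_j^T (illustrative sketch only;
    the real NLRR++ objective adds self-expressive and regularization
    terms not modeled here)."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    U = 0.01 * rng.standard_normal((m, d))
    V = 0.01 * rng.standard_normal((n, d))
    R = X - U @ V.T                      # residual kept incrementally: O(mn) memory
    for _ in range(n_sweeps):
        for j in range(d):
            # add component j back into the residual
            R += np.outer(U[:, j], V[:, j])
            # closed-form alternating rank-one least-squares updates, O(mn) each
            v = V[:, j]
            u = R @ v / max(v @ v, 1e-12)
            v = R.T @ u / max(u @ u, 1e-12)
            U[:, j], V[:, j] = u, v
            # subtract the refreshed component
            R -= np.outer(u, v)
    return U, V, R
```

Maintaining the residual incrementally is what avoids ever forming an n × n matrix; each component only touches X through matrix-vector products.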

Citation (APA)
Wang, J., Hsieh, C. J., & Shi, D. (2018). NLRR++: Scalable subspace clustering via non-convex block coordinate descent. In SIAM International Conference on Data Mining, SDM 2018 (pp. 28–36). Society for Industrial and Applied Mathematics Publications. https://doi.org/10.1137/1.9781611975321.4
