A Block Decomposition Algorithm for Sparse Optimization


Abstract

Sparse optimization is a central problem in machine learning and computer vision. However, this problem is inherently NP-hard and thus difficult to solve in general. Combinatorial search methods find the globally optimal solution but are confined to small problems, while coordinate descent methods are efficient but often suffer from poor local minima. This paper considers a new block decomposition algorithm that combines the effectiveness of combinatorial search methods with the efficiency of coordinate descent methods. Specifically, we use a random and/or greedy strategy to select a subset of coordinates as the working set, and then perform a global combinatorial search over the working set based on the original objective function. We show that our method finds stronger stationary points than the coordinate-wise optimization method of Beck et al. In addition, we establish the convergence rate of our algorithm. Our experiments on sparse-regularized and sparsity-constrained least squares problems demonstrate that our method achieves state-of-the-art accuracy; for example, it generally outperforms the well-known greedy pursuit method.
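To make the idea concrete, below is a minimal sketch of the block-decomposition scheme for the sparsity-constrained least squares problem (minimize ||Ax - b||^2 subject to ||x||_0 <= s). This is not the authors' implementation: it uses only the random working-set strategy, the function name and parameters are illustrative, and the exact combinatorial subproblem is solved by brute-force enumeration of support patterns inside the working set.

```python
import itertools
import numpy as np

def block_decomposition_l0(A, b, s, k=3, n_iters=200, seed=0):
    """Illustrative sketch: at each iteration, draw a random working set
    of k coordinates, then solve the restricted subproblem *exactly* by
    enumerating every support pattern inside the working set."""
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    x = np.zeros(n)
    for _ in range(n_iters):
        B = rng.choice(n, size=k, replace=False)   # random working set
        x[B] = 0.0                                 # free the block variables
        fixed = b - A @ x                          # residual from frozen coords
        budget = s - np.count_nonzero(x)           # sparsity budget left for the block
        # baseline: keep all block coordinates at zero
        best_val, best_S, best_z = np.sum(fixed**2), (), None
        # combinatorial search over all feasible support patterns in the block
        for r in range(1, min(k, budget) + 1):
            for S in itertools.combinations(B, r):
                AS = A[:, list(S)]
                z, *_ = np.linalg.lstsq(AS, fixed, rcond=None)
                val = np.sum((AS @ z - fixed)**2)
                if val < best_val:
                    best_val, best_S, best_z = val, S, z
        if best_z is not None:
            x[list(best_S)] = best_z
    return x
```

Because the all-zero pattern is always a candidate, each block update never increases the objective, so the iterates are monotonically nonincreasing; the per-iteration cost grows as 2^k, which is why the working set is kept small.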

Citation (APA)

Yuan, G., Shen, L., & Zheng, W. S. (2020). A Block Decomposition Algorithm for Sparse Optimization. In Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (pp. 275–285). Association for Computing Machinery. https://doi.org/10.1145/3394486.3403070
