Robust feature selection via nonconvex sparsity-based methods


Abstract

In this paper, we propose a new model for supervised multiclass feature selection that uses the l2,1-norm in both the fidelity loss and the regularization terms, together with an additional l2,0-norm constraint. This problem is challenging for existing optimization methods because of the discontinuous, nonconvex nature of the l2,0-norm. We first convert the l2,0-norm constraint into an equivalent constraint defined by the difference of two matrix norms. We then reformulate the problem as an unconstrained one via the exact penalty method. Based on a derived formula for the proximal mapping of this difference of matrix norms and on Nesterov's smoothing technique, the nonmonotone accelerated proximal gradient method is applied to solve the unconstrained problem. Numerical experiments on a range of benchmark data sets demonstrate the effectiveness of the proposed method in comparison with existing ones.
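To make the proximal-gradient machinery mentioned above concrete, the sketch below implements the standard row-wise proximal operator of the l2,1-norm and a plain proximal gradient (ISTA-style) loop for the convex surrogate problem min ||XW - Y||_F^2 / 2 + lam * ||W||_{2,1}. This is an illustrative simplification only: it omits the paper's l2,0-norm constraint, the difference-of-norms penalty, Nesterov smoothing, and the nonmonotone acceleration, and all function names and parameters here are our own, not the authors'.

```python
import numpy as np

def prox_l21(W, lam):
    """Proximal operator of lam * ||W||_{2,1}: each row of W is shrunk
    toward zero by lam in Euclidean norm (rows with norm <= lam vanish,
    which is what drives feature selection)."""
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - lam / np.maximum(norms, 1e-12))
    return scale * W

def proximal_gradient(X, Y, lam, step=None, iters=200):
    """Minimize ||XW - Y||_F^2 / 2 + lam * ||W||_{2,1} by proximal
    gradient descent. A convex sketch of the general scheme, not the
    paper's nonconvex l2,0-constrained method."""
    d, k = X.shape[1], Y.shape[1]
    if step is None:
        # 1/L, where L = sigma_max(X)^2 is the Lipschitz constant
        # of the gradient of the smooth loss term.
        step = 1.0 / np.linalg.norm(X, 2) ** 2
    W = np.zeros((d, k))
    for _ in range(iters):
        grad = X.T @ (X @ W - Y)          # gradient of the fidelity term
        W = prox_l21(W - step * grad, step * lam)
    return W
```

Features whose rows of W are shrunk exactly to zero are discarded; the row-wise (rather than entry-wise) shrinkage is what makes the l2,1-norm select whole features across all classes at once.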

Citation (APA)

An, N. T., Dong, P. D., & Qin, X. (2021). Robust feature selection via nonconvex sparsity-based methods. Journal of Nonlinear and Variational Analysis, 5(1), 59–77. https://doi.org/10.23952/JNVA.5.2021.1.05
