ℓ2,1-norm minimization for Unsupervised Feature Selection from Incomplete Data

Abstract

Unsupervised feature selection (UFS) is widely used in machine learning applications, since it alleviates the curse of dimensionality in high-dimensional data. However, existing UFS methods are designed under the assumption that the complete data matrix can be observed. In this paper, we propose a UFS method for incomplete data based on ℓ2,1-norm minimization. To exploit feature relevance for UFS, we show that ℓ2,1-norm minimization must carefully account for unobserved entries, which can be precisely described with an indicator matrix. The proposed ℓ2,1-norm minimization UFS from incomplete data (LUFS) is formulated as a convex optimization problem, and we design an alternating algorithm to solve it, whose convergence is proved both theoretically and experimentally. Empirical results on real incomplete datasets demonstrate the effectiveness of LUFS compared with state-of-the-art methods.
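The two ingredients named in the abstract can be illustrated in a few lines: the ℓ2,1-norm of a weight matrix (whose minimization drives entire rows, i.e. entire features, to zero) and an indicator matrix that restricts a data-fitting term to observed entries. The sketch below is a minimal illustration under assumed conventions (NaN marks missing values; the masked residual is one plausible data-fitting term), not the paper's exact LUFS objective.

```python
import numpy as np

def l21_norm(W):
    # ℓ2,1-norm: sum of the Euclidean norms of the rows of W.
    # Minimizing it induces row sparsity, so whole features are
    # zeroed out -- the mechanism behind ℓ2,1-based feature selection.
    return np.sqrt((W ** 2).sum(axis=1)).sum()

# Hypothetical incomplete dataset: NaN marks unobserved entries.
X = np.array([[1.0, np.nan, 3.0],
              [4.0, 5.0, np.nan]])

S = ~np.isnan(X)              # indicator matrix: True where observed
X_obs = np.where(S, X, 0.0)   # zero-fill so masked terms drop out

def masked_residual(X_obs, S, W):
    # Reconstruction residual restricted to observed entries via the
    # indicator matrix S; unobserved positions contribute nothing.
    R = (X_obs - X_obs @ W) * S
    return (R ** 2).sum()
```

A full method along these lines would minimize a masked data-fitting term plus a λ·ℓ2,1-norm penalty, alternating updates until convergence; features are then ranked by the row norms of the learned W.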


Fan, L., Wu, X., Tong, W., & Zeng, W. (2021). ℓ2,1-norm minimization for Unsupervised Feature Selection from Incomplete Data. In 2021 7th International Conference on Computer and Communications, ICCC 2021 (pp. 1491–1495). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ICCC54389.2021.9674464
