Multi-label dimensionality reduction is an appealing yet challenging task in data mining and machine learning. Previous work on multi-label dimensionality reduction is mainly conducted in an unsupervised or fully supervised manner and ignores the abundant unlabeled samples. In addition, most existing methods emphasize pairwise correlations between samples and are therefore unable to exploit high-order sample information to improve performance. To address these challenges, we propose an approach called Semi-supervised Multi-label Dimensionality Reduction via Low Rank Representation (SMLD-LRR). SMLD-LRR first applies low rank representation in the feature space of the samples to compute a low-rank-constrained coefficient matrix, and then uses this coefficient matrix to capture the high-order structure among samples. Next, it applies low rank representation in the label space of the labeled samples to explore the global correlations among labels. SMLD-LRR then employs the learned high-order sample structure to enforce consistency between samples in the original space and their counterparts in the projected subspace by maximizing the dependence between them. Finally, these two high-order correlations and the dependence term are incorporated into multi-label linear discriminant analysis for dimensionality reduction. Extensive experiments on four multi-label datasets demonstrate that SMLD-LRR outperforms competitive methods across various evaluation criteria, and that it can effectively exploit high-order label correlations to preserve sample structure in the projected subspace.
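The two building blocks named in the abstract can be sketched in a few lines of NumPy. This is an illustrative sketch, not the paper's implementation: it assumes the noiseless low rank representation problem min ||Z||_* s.t. X = XZ, whose closed-form solution is the shape-interaction matrix Z = V_r V_r^T from the skinny SVD of X, and it assumes the dependence term is measured by the empirical Hilbert–Schmidt Independence Criterion (HSIC), a common choice for such consistency objectives. The function names `lrr_noiseless` and `hsic` are hypothetical.

```python
import numpy as np

def lrr_noiseless(X, rank=None):
    """Closed-form noiseless LRR: min ||Z||_* s.t. X = XZ.

    X is a d x n data matrix (columns are samples). The solution is the
    shape-interaction matrix Z = V_r V_r^T built from the right singular
    vectors of X, an n x n coefficient matrix encoding sample structure.
    """
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    if rank is None:
        # estimate the effective rank from the singular-value spectrum
        rank = int(np.sum(s > 1e-10 * s[0]))
    Vr = Vt[:rank].T                 # n x r right singular vectors
    return Vr @ Vr.T                 # n x n low rank coefficient matrix

def hsic(K, L):
    """Empirical HSIC between two n x n kernel matrices K and L.

    HSIC(K, L) = tr(K H L H) / (n - 1)^2 with the centering matrix
    H = I - (1/n) 11^T; larger values indicate stronger dependence.
    """
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2
```

In this reading, SMLD-LRR would compute `lrr_noiseless` coefficient matrices in both the feature space and the label space, and maximize an `hsic`-style dependence between the original samples and their projections; the actual paper additionally handles noise and couples all terms inside a multi-label LDA objective.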
Liu, Y. (2018). Semi-supervised multi-label dimensionality reduction via low rank representation. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11303 LNCS, pp. 625–637). Springer Verlag. https://doi.org/10.1007/978-3-030-04182-3_55