Feature selection for partial least square based dimension reduction

Abstract

In this chapter, we introduce our recent work on feature selection for Partial Least Square based Dimension Reduction (PLSDR). Previous works on PLSDR have performed well on biomedical and chemical data sets, but open problems remain, such as how to determine the number of principal components and how to remove irrelevant and redundant features for PLSDR. Firstly, we propose a general framework describing how to perform feature selection for dimension reduction methods, which contains a preprocessing step of irrelevant and redundant feature selection and a postprocessing step of selecting principal components. Secondly, as a concrete example, we address these problems in the case of PLSDR: 1) we discuss how to determine the number of top-ranked features for PLSDR; 2) we propose to remove irrelevant features for PLSDR by using an efficient algorithm based on feature probes; 3) we investigate a supervised solution to remove redundant features; 4) we study whether the top-ranked features are important for classification and how to select the most discriminant principal components. The proposed algorithms are evaluated on several benchmark microarray data sets and show satisfactory performance. © 2009 Springer-Verlag Berlin Heidelberg.
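
As a rough illustration of the pipeline summarized above, the following Python sketch (assuming scikit-learn and NumPy are available) removes presumably irrelevant features with a simple univariate filter, projects the remaining features onto a small number of PLS latent components, and chooses the number of components by cross-validated classification accuracy. The univariate filter and the cross-validation rule are illustrative stand-ins, not the probe-based irrelevant-feature removal or the supervised component selection developed in the chapter.

# A minimal sketch of a PLS-based dimension reduction pipeline for
# classification on high-dimensional ("microarray-like") data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.cross_decomposition import PLSRegression
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold

# Synthetic data: few samples, many features.
X, y = make_classification(n_samples=120, n_features=2000,
                           n_informative=30, random_state=0)

def cv_accuracy(X, y, n_components, n_splits=5):
    """Cross-validated accuracy of: filter -> PLS scores -> logistic regression."""
    skf = StratifiedKFold(n_splits=n_splits, shuffle=True, random_state=0)
    accs = []
    for train_idx, test_idx in skf.split(X, y):
        X_tr, X_te = X[train_idx], X[test_idx]
        y_tr, y_te = y[train_idx], y[test_idx]

        # Step 1: drop presumably irrelevant features with a univariate filter
        # (stand-in for the chapter's feature-probe approach).
        fs = SelectKBest(f_classif, k=500).fit(X_tr, y_tr)
        X_tr_f, X_te_f = fs.transform(X_tr), fs.transform(X_te)

        # Step 2: supervised dimension reduction with PLS latent components.
        pls = PLSRegression(n_components=n_components).fit(X_tr_f, y_tr)
        Z_tr, Z_te = pls.transform(X_tr_f), pls.transform(X_te_f)

        # Step 3: classify in the reduced space.
        clf = LogisticRegression(max_iter=1000).fit(Z_tr, y_tr)
        accs.append(clf.score(Z_te, y_te))
    return np.mean(accs)

# Choose the number of PLS components by cross-validated accuracy
# (stand-in for the chapter's supervised component selection).
scores = {k: cv_accuracy(X, y, k) for k in range(1, 9)}
best_k = max(scores, key=scores.get)
print(f"best n_components = {best_k}, CV accuracy = {scores[best_k]:.3f}")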

Citation (APA)
Li, G. Z., & Zeng, X. Q. (2009). Feature selection for partial least square based dimension reduction. Studies in Computational Intelligence, 205, 3–37. https://doi.org/10.1007/978-3-642-01536-6_1
