A comparative study between feature selection algorithms

Abstract

In this paper, we present a comparative study of four algorithms used for feature selection: decision trees, an entropy measure for ranking features, estimation of distribution algorithms, and the bootstrapping algorithm. Feature selection is highlighted as the most representative task for eliminating noise and thereby improving the quality of the dataset. Each algorithm is then described so that the reader understands how it works, after which the algorithms are applied to different data sets and the selection results are obtained. Finally, the conclusions of this investigation are presented.
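
To illustrate the kind of technique the paper compares, below is a minimal sketch (not the authors' implementation) of entropy-based feature ranking via information gain. The toy dataset, feature values, and scoring thresholds are hypothetical and chosen only for demonstration.

```python
# Minimal sketch of entropy-based feature ranking (information gain).
# Not from the paper; toy categorical data is hypothetical.
import numpy as np

def entropy(labels):
    """Shannon entropy of a label vector."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(feature, labels):
    """Reduction in label entropy after splitting on one categorical feature."""
    total = entropy(labels)
    values, counts = np.unique(feature, return_counts=True)
    weighted = sum(
        (c / len(labels)) * entropy(labels[feature == v])
        for v, c in zip(values, counts)
    )
    return total - weighted

# Toy data: three categorical features and a binary class label.
X = np.array([
    ["sunny", "hot",  "high"],
    ["sunny", "hot",  "high"],
    ["rainy", "mild", "high"],
    ["rainy", "cool", "normal"],
    ["sunny", "cool", "normal"],
])
y = np.array([0, 0, 1, 1, 1])

# Score each feature column and rank them; higher gain suggests a more
# informative feature, which could then be kept during selection.
scores = [information_gain(X[:, j], y) for j in range(X.shape[1])]
ranking = np.argsort(scores)[::-1]
print("scores:", np.round(scores, 3), "ranking:", ranking)
```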

Citation (APA)

Medina Garcia, V. H., Rodriguez Rodriguez, J., & Ospina Usaquén, M. A. (2018). A comparative study between feature selection algorithms. In Lecture Notes in Computer Science (Vol. 10943 LNCS, pp. 65–76). Springer. https://doi.org/10.1007/978-3-319-93803-5_7
