Feature Selection Algorithms: A Comparative Study


Abstract

Machine learning models are built on top of features, but not all features present in the data contribute to building a robust model. Furthermore, as the number of features grows, the model becomes more complex, leading to the curse of dimensionality. Reducing the number of features improves both the time and space complexity of model development. Here, various feature selection methods are adopted: Sequential Forward Selection (SFS), Backward Elimination (BE), Recursive Feature Elimination (RFE), correlation, the one-way ANOVA test, and hybrid methods. The present study uses XGBoost to build the model. With feature selection techniques, there is a significant change in the performance of the model.
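As an illustrative sketch (not the paper's code), two of the selection methods named above, the one-way ANOVA F-test and RFE, can be compared against a no-selection baseline with scikit-learn. `GradientBoostingClassifier` stands in for XGBoost here to keep the example dependency-free; the dataset and the choice of 10 retained features are arbitrary assumptions for illustration.

```python
# Sketch: compare ANOVA-based selection and RFE against using all features.
# GradientBoostingClassifier is a stand-in for XGBoost (assumption).
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.feature_selection import RFE, SelectKBest, f_classif
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

def score(Xa, Xb):
    """Fit the model on the (possibly reduced) training set and score on test."""
    clf = GradientBoostingClassifier(random_state=0).fit(Xa, y_tr)
    return accuracy_score(y_te, clf.predict(Xb))

# Baseline: all 30 features.
base = score(X_tr, X_te)

# One-way ANOVA F-test: keep the 10 highest-scoring features.
anova = SelectKBest(f_classif, k=10).fit(X_tr, y_tr)
anova_acc = score(anova.transform(X_tr), anova.transform(X_te))

# Recursive Feature Elimination: drop the weakest feature until 10 remain.
rfe = RFE(GradientBoostingClassifier(random_state=0),
          n_features_to_select=10).fit(X_tr, y_tr)
rfe_acc = score(rfe.transform(X_tr), rfe.transform(X_te))

print(f"all features: {base:.3f}  ANOVA-10: {anova_acc:.3f}  RFE-10: {rfe_acc:.3f}")
```

Such a comparison mirrors the study's setup: each selection method yields a reduced feature set, and the same boosted-tree model is scored on each to measure the performance change.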

Citation (APA)

Srinivas, P., Guggari, S., Darapaneni, N., Paduri, A. R., & Sudha, B. G. (2023). Feature Selection Algorithms: A Comparative Study. In Lecture Notes in Networks and Systems (Vol. 648 LNNS, pp. 402–412). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-27524-1_38
