Machine learning models are built on top of features, but not all features present in the data contribute to a robust model. Further, as the number of features grows, the model becomes more complex, resulting in the curse of dimensionality. Reducing the number of features improves both the time and space complexity of model development. Here, various feature selection methods are adopted: Sequential Forward Selection (SFS), Backward Elimination (BE), Recursive Feature Elimination (RFE), correlation, the one-way ANOVA test, and hybrid methods. The present study makes use of the XGBoost method to build the model. Applying these feature selection techniques produces a significant change in the model's performance.
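To illustrate the wrapper-style selection the abstract names, the following is a minimal sketch of Sequential Forward Selection (SFS): greedily add the feature that most improves a scoring function until a target size is reached. The `score` callable, the feature names, and the toy utility values are hypothetical stand-ins, not the paper's XGBoost-based evaluation or its data.

```python
def sequential_forward_selection(all_features, score, k):
    """Greedily add the feature that most improves `score` until k features are chosen."""
    selected = []
    remaining = list(all_features)
    while remaining and len(selected) < k:
        best_feat, best_score = None, float("-inf")
        for f in remaining:
            s = score(selected + [f])  # evaluate candidate subset
            if s > best_score:
                best_feat, best_score = f, s
        selected.append(best_feat)
        remaining.remove(best_feat)
    return selected

# Hypothetical per-feature utilities with a size penalty standing in
# for a real model-validation score (e.g. cross-validated accuracy).
utility = {"age": 0.9, "income": 0.8, "zip": 0.2, "height": 0.1}

def toy_score(feats):
    return sum(utility[f] for f in feats) - 0.05 * len(feats) ** 2

print(sequential_forward_selection(utility, toy_score, 2))  # → ['age', 'income']
```

Backward Elimination is the mirror image of this loop: start from the full feature set and repeatedly drop the feature whose removal hurts the score least.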
Srinivas, P., Guggari, S., Darapaneni, N., Paduri, A. R., & Sudha, B. G. (2023). Feature Selection Algorithms: A Comparative Study. In Lecture Notes in Networks and Systems (Vol. 648 LNNS, pp. 402–412). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-27524-1_38