Analysis on Improving the Performance of Machine Learning Models Using Feature Selection Technique

19 Citations · 46 Readers (Mendeley)
Abstract

Many organizations that deploy computer networks are susceptible to a wide range of attacks that compromise the confidentiality, integrity, and availability of network systems. Building a reliable network is a major challenge, as attackers continually introduce new attacks. The aim of this paper is to improve the performance of several machine learning algorithms — KNN, Decision Tree, Random Forest, Bagging Meta Estimator, and XGBoost — by applying a feature-importance technique. These classifiers are chosen because, after feature selection, they outperform other base and ensemble machine learning techniques. The feature-importance technique is used to obtain the highest-ranked features; the reduced attribute set improves accuracy while also decreasing computation and prediction time. Experimental results on the UNSW-NB dataset show a drastic decrease in computation time with reduced attributes compared to evaluating the models on the full set of attributes.
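The abstract's pipeline — train a classifier, rank features by importance, retrain on the top-ranked subset — can be sketched as follows. This is a minimal illustration using scikit-learn's Random Forest on synthetic data; the dataset, feature count, and cut-off `k` are placeholders, not the paper's actual UNSW-NB setup or its reported results.

```python
# Sketch of feature-importance-based selection (synthetic data stands in
# for the UNSW-NB dataset; k = 10 is an assumed cut-off, not the paper's).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=40,
                           n_informative=10, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=42)

# Baseline: train on all 40 attributes.
rf_full = RandomForestClassifier(n_estimators=100, random_state=42)
rf_full.fit(X_tr, y_tr)
acc_full = accuracy_score(y_te, rf_full.predict(X_te))

# Rank attributes by impurity-based importance and keep the top k.
k = 10
top_idx = np.argsort(rf_full.feature_importances_)[::-1][:k]

# Retrain on the reduced attribute set only.
rf_reduced = RandomForestClassifier(n_estimators=100, random_state=42)
rf_reduced.fit(X_tr[:, top_idx], y_tr)
acc_reduced = accuracy_score(y_te, rf_reduced.predict(X_te[:, top_idx]))

print(f"full: {acc_full:.3f}  reduced ({k} features): {acc_reduced:.3f}")
```

The same ranking step applies unchanged to the other tree-based classifiers the paper evaluates (Decision Tree, Bagging, XGBoost), all of which expose a `feature_importances_` attribute after fitting.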

Citation (APA)

Khan, N. M., Madhav C, N., Negi, A., & Thaseen, I. S. (2020). Analysis on Improving the Performance of Machine Learning Models Using Feature Selection Technique. In Advances in Intelligent Systems and Computing (Vol. 941, pp. 69–77). Springer Verlag. https://doi.org/10.1007/978-3-030-16660-1_7
