A Comparative Study of Feature Selection Techniques for Bat Algorithm in Various Applications


Abstract

Feature selection is the process of selecting the best features from among the huge number of features in a dataset. The core problem is to find a subset that performs better under a given classifier. To produce better classification results, feature selection has been applied in many classification tasks as a preprocessing step, where only a subset of features is used rather than all the features from a particular dataset. This procedure not only removes irrelevant features but, in some cases, can also increase classification performance due to the finite sample size. In this study, Chi-Square (CH), Information Gain (IG) and the Bat Algorithm (BA) are used to obtain feature subsets on fourteen well-known datasets from various applications. To measure the performance of the selected features, three benchmark classifiers are used: k-Nearest Neighbor (kNN), Naïve Bayes (NB) and Decision Tree (DT). The paper then analyzes the performance of all classifiers with feature selection in terms of accuracy, sensitivity, F-measure and ROC. The objective of this study is to determine which feature selection technique, conventional or heuristic, performs best across various applications.
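To illustrate the filter-style selection the abstract describes, the sketch below scores features by Information Gain (one of the conventional techniques named above) and keeps the top-k. This is a minimal sketch only: the function names and the toy dataset are illustrative assumptions, not the authors' code, and a full study would pair such a filter with classifiers like kNN, NB or DT.

```python
import math
from collections import Counter

def entropy(labels):
    # Shannon entropy H(Y) over a list of class labels
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature_col, labels):
    # IG(Y; X) = H(Y) - H(Y | X), for one (discrete) feature column
    n = len(labels)
    cond = 0.0
    for v in set(feature_col):
        subset = [y for x, y in zip(feature_col, labels) if x == v]
        cond += len(subset) / n * entropy(subset)
    return entropy(labels) - cond

def select_top_k(X, y, k):
    # X: list of rows; rank feature indices by IG and keep the top k
    n_feat = len(X[0])
    scores = [information_gain([row[j] for row in X], y) for j in range(n_feat)]
    return sorted(range(n_feat), key=lambda j: scores[j], reverse=True)[:k]

# Toy data (hypothetical): feature 0 perfectly predicts y, feature 1 is noise
X = [[0, 1], [0, 0], [1, 1], [1, 0]]
y = [0, 0, 1, 1]
print(select_top_k(X, y, 1))  # -> [0]
```

Chi-Square ranking would follow the same pattern, only with a chi-square statistic per feature in place of `information_gain`; the Bat Algorithm, being a heuristic wrapper method, instead searches the space of feature subsets guided by classifier performance.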

Citation (APA)

Mohamed, R., Mohd Yusof, M., & Wahidi, N. (2018). A Comparative Study of Feature Selection Techniques for Bat Algorithm in Various Applications. In MATEC Web of Conferences (Vol. 150). EDP Sciences. https://doi.org/10.1051/matecconf/201815006006
