Improved WOA and its application in feature selection

Abstract

Feature selection (FS) eliminates redundant, irrelevant, and noisy features in high-dimensional data, improving the prediction, classification, and computational performance of machine learning and data mining models. We propose an improved whale optimization algorithm (IWOA) combined with an improved k-nearest neighbors (IKNN) classifier for feature selection (IWOAIKFS). First, WOA is improved with chaotic elite opposition-based individuals, probability selection based on a skew distribution, nonlinear adjustment of the control parameters, and a position correction strategy, enhancing the algorithm's search over feature subsets. Second, the KNN classifier is improved with a sample similarity measurement criterion and a weighted voting criterion whose weight matrix M is solved by a simulated annealing algorithm, improving the evaluation of candidate feature subsets. Experimental results show that IWOA achieves better optimization performance on benchmark functions of varying dimensions, and that IWOAIKFS, which pairs IWOA with IKNN for feature selection, achieves better classification accuracy and robustness.
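To make the weighted-voting idea concrete, below is a minimal sketch of a distance-weighted KNN prediction. The per-feature weight vector `w` here is a caller-supplied stand-in for the paper's weight matrix M (which IWOAIKFS fits via simulated annealing); the inverse-distance vote is one common weighting choice, not necessarily the authors' exact criterion.

```python
import math

def weighted_knn_predict(train, x, k=5, w=None):
    """Predict the label of sample x by weighted KNN voting.

    train: list of (features, label) pairs
    x:     feature vector to classify
    w:     per-feature weights (stand-in for the paper's weight matrix M)
    """
    n = len(x)
    w = [1.0] * n if w is None else w
    # Weighted Euclidean distance from x to every training sample.
    dists = []
    for feats, label in train:
        d = math.sqrt(sum(wi * (fi - xi) ** 2
                          for wi, fi, xi in zip(w, feats, x)))
        dists.append((d, label))
    dists.sort(key=lambda t: t[0])
    # Each of the k nearest neighbors votes with weight 1/distance,
    # so closer samples count for more.
    votes = {}
    for d, label in dists[:k]:
        votes[label] = votes.get(label, 0.0) + 1.0 / (d + 1e-12)
    return max(votes, key=votes.get)
```

Unlike plain majority voting, a nearby neighbor can outvote several distant ones, which is what the weighted voting criterion in the abstract is aiming at.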

Citation (APA)

Liu, W., Guo, Z., Jiang, F., Liu, G., Wang, D., & Ni, Z. (2022). Improved WOA and its application in feature selection. PLoS ONE, 17(5), e0267041. https://doi.org/10.1371/journal.pone.0267041
