An efficient binary Gradient-based optimizer for feature selection


Abstract

Feature selection (FS) is a classic and challenging optimization task in machine learning and data mining. The gradient-based optimizer (GBO) is a recently developed population-based metaheuristic inspired by gradient-based Newton's method; it explores the search space of continuous problems using two main operators, the gradient search rule (GSR) and the local escaping operator (LEO), together with a set of vectors. This article presents a binary GBO (BGBO) algorithm for feature selection problems. Eight independent binary GBO variants are proposed, each built on one of eight transfer functions, divided into the S-shaped and V-shaped families, that map the continuous search space to a discrete one. To verify the performance of the proposed binary GBO algorithm, it is tested on 18 well-known UCI datasets and 10 high-dimensional datasets and compared with other advanced FS methods. The experimental results show that the best of the proposed binary GBO variants outperforms other well-known metaheuristic algorithms on the reported performance measures.
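The binarization step described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: `s_shaped` and `v_shaped` here are the commonly used S1 (sigmoid) and V1 (|tanh|) transfer functions, and the update rules follow the standard convention that an S-shaped function sets each bit directly while a V-shaped function decides whether to flip it.

```python
import math
import random

def s_shaped(x):
    # S1-style sigmoid transfer function (one common choice; the paper
    # evaluates four S-shaped and four V-shaped functions, whose exact
    # forms are not reproduced here).
    return 1.0 / (1.0 + math.exp(-x))

def v_shaped(x):
    # V1-style transfer function based on |tanh|.
    return abs(math.tanh(x))

def binarize_s(position, rng):
    # S-shaped rule: set each bit to 1 with probability T(x_d).
    return [1 if rng.random() < s_shaped(x) else 0 for x in position]

def binarize_v(position, current_bits, rng):
    # V-shaped rule: flip the current bit with probability T(x_d).
    return [1 - b if rng.random() < v_shaped(x) else b
            for x, b in zip(position, current_bits)]

rng = random.Random(0)
x = [rng.gauss(0.0, 1.0) for _ in range(10)]  # continuous GBO position
mask = binarize_s(x, rng)                     # binary mask: 1 = feature selected
```

In an FS loop, each candidate's binary mask selects a feature subset whose classification accuracy (and subset size) drives the fitness used by the continuous GBO update.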

APA

Jiang, Y., Luo, Q., Wei, Y., Abualigah, L., & Zhou, Y. (2021). An efficient binary Gradient-based optimizer for feature selection. Mathematical Biosciences and Engineering, 18(4), 3813–3854. https://doi.org/10.3934/mbe.2021192
