The support vector machine (SVM) is one of the most effective learning algorithms and has many real-world applications. The kernel type and its parameters have a significant impact on the SVM's effectiveness and performance. Feature subset selection is another crucial step in machine learning, especially when working with high-dimensional data sets, yet most earlier studies treated these two criteria independently. In this research, we propose a hybrid strategy based on the Harris hawks optimization (HHO) algorithm, a recently proposed metaheuristic that has been shown to handle a range of optimization problems efficiently. The proposed method optimizes the SVM model parameters while simultaneously locating the optimal feature subset. We ran the proposed HHO-SVM approach on real biomedical datasets covering 17 types of cancer for Iraqi patients in 2010-2012. The experimental results demonstrate the superiority of the proposed HHO-SVM on three performance metrics: classification accuracy, runtime, and number of selected features. For verification, the proposed method is compared with four well-known algorithms: the firefly (FF) algorithm, genetic algorithm (GA), grasshopper optimization algorithm (GOA), and particle swarm optimization (PSO). The proposed HHO-SVM approach achieves 99.967% average accuracy.
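The hybrid scheme summarized above can be illustrated with a minimal sketch (not the authors' implementation): each candidate solution in the HHO population encodes the SVM hyperparameters C and gamma together with a binary feature mask, and the fitness trades classification accuracy against the size of the selected subset. The parameter ranges and the trade-off weight below are illustrative assumptions.

```python
import math

def decode(position, c_range=(0.01, 100.0), g_range=(1e-4, 10.0)):
    """Map a continuous HHO position vector to SVM settings.

    position[0] and position[1] lie in [0, 1] and are scaled
    log-uniformly onto assumed C and gamma ranges; the remaining
    entries are thresholded at 0.5 to form a binary feature mask.
    """
    def log_scale(u, lo, hi):
        return math.exp(math.log(lo) + u * (math.log(hi) - math.log(lo)))

    C = log_scale(position[0], *c_range)
    gamma = log_scale(position[1], *g_range)
    mask = [1 if u > 0.5 else 0 for u in position[2:]]
    return C, gamma, mask

def fitness(accuracy, mask, alpha=0.99):
    """Weighted objective: favour high accuracy, penalise large subsets.

    alpha is an assumed trade-off weight; the paper's exact objective
    may differ. `accuracy` would come from cross-validating an SVM
    trained on the masked features with the decoded C and gamma.
    """
    ratio = sum(mask) / len(mask)
    return alpha * accuracy + (1 - alpha) * (1 - ratio)
```

In each HHO iteration, every hawk's position would be decoded this way, scored by `fitness`, and the population updated toward the best-scoring solution, so hyperparameter tuning and feature selection proceed jointly rather than independently.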
Ibrahim, H. T., Mazher, W. J., & Jassim, E. M. (2023). Modified Harris Hawks optimizer for feature selection and support vector machine kernels. Indonesian Journal of Electrical Engineering and Computer Science, 29(2), 942–953. https://doi.org/10.11591/ijeecs.v29.i2.pp942-953