Extended Stochastic Coati Optimizer

Abstract

A new metaheuristic can be developed from scratch, by modifying an existing metaheuristic, or by hybridizing several metaheuristics. This work presents a new metaheuristic: the extended stochastic coati optimizer (ESCO). ESCO is developed by extending the recently proposed coati optimization algorithm (COA). ESCO expands the number of searches and references used in COA. ESCO also implements a stochastic process in which each unit chooses the search it will perform. This differs from COA, which splits the population into two fixed groups, each performing its own strategy. ESCO implements three sequential phases in every iteration, and two options can be chosen in every phase. ESCO uses three references in its guided search: the global best unit, a randomly selected unit, and a randomized unit within the search space. In this work, ESCO is challenged to solve 23 classic functions and benchmarked against five recent metaheuristics: the guided pelican algorithm (GPA), puzzle optimization algorithm (POA), GSO, average subtraction-based optimizer (ASBO), and coati optimization algorithm (COA). The result demonstrates the superiority of ESCO, which outperforms GPA, POA, GSO, ASBO, and COA in solving 13, 21, 23, 16, and 13 functions, respectively. The investigation also shows that the multiple-search approach is more effective than the single-search approach.
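The loop structure described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the phase-selection probabilities, the update rule `x + r*(ref - x)`, and the greedy acceptance are all assumptions; only the overall structure (three sequential phases per unit, a stochastic choice of search in each phase, and the three guidance references) follows the abstract.

```python
import random

def esco_sketch(f, dim, lo, hi, pop_size=20, iters=100):
    """Illustrative ESCO-style loop; the update equations are assumptions."""
    # Population of candidate solutions ("units") initialized uniformly at random.
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    best = min(pop, key=f)
    for _ in range(iters):
        for i in range(pop_size):
            x = pop[i]
            for _phase in range(3):  # three sequential phases per iteration
                # Stochastic choice of search in each phase
                # (the 0.5 probabilities are assumptions).
                if random.random() < 0.5:
                    ref = best                # guided by the global best unit
                elif random.random() < 0.5:
                    ref = random.choice(pop)  # guided by a randomly selected unit
                else:
                    # guided by a randomized unit within the search space
                    ref = [random.uniform(lo, hi) for _ in range(dim)]
                # Assumed guided-search step: move toward the reference.
                cand = [xj + random.random() * (rj - xj) for xj, rj in zip(x, ref)]
                cand = [min(max(c, lo), hi) for c in cand]  # keep within bounds
                if f(cand) < f(x):            # greedy acceptance (assumption)
                    x = cand
            pop[i] = x
            if f(x) < f(best):
                best = x
    return best

# Usage: minimize the 2-D sphere function on [-10, 10]^2.
result = esco_sketch(lambda v: sum(x * x for x in v), dim=2, lo=-10.0, hi=10.0)
```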

Cite (APA)
Kusuma, P. D., & Dinimaharawati, A. (2023). Extended Stochastic Coati Optimizer. International Journal of Intelligent Engineering and Systems, 16(3), 482–494. https://doi.org/10.22266/ijies2023.0630.38
