As an essential step in metaheuristic optimizers, initialization strongly affects convergence speed and solution accuracy. State-of-the-art initialization methods aim to generate a small initial population that covers the search space as uniformly as possible. However, these approaches suffer from the curse of dimensionality, high computational cost, and parameter sensitivity, which ultimately slow the algorithm's convergence. In this paper, a new initialization technique named diagonal linear uniform initialization (DLU) is proposed. It follows a novel view of the search space: sampling a diagonal subspace instead of the whole space. By taking the algorithm's update mechanism into account, the improved sampling method markedly improves the convergence speed and solution accuracy of metaheuristic algorithms. Compared with eight other widely used initialization strategies, the differential evolution (DE) algorithm with DLU achieves the best performance in search accuracy and convergence speed. In extension experiments, DLU remains effective for three swarm-based algorithms: particle swarm optimization (PSO), cuckoo search (CS), and artificial bee colony (ABC). On multi-objective problems, DLU likewise outperforms the other strategies.
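The abstract does not give the exact DLU formula, so the sketch below is only one plausible reading of "diagonal subspace sampling": the initial individuals are spaced uniformly along the main diagonal of the bounded search space rather than drawn at random from the full hyper-rectangle. The function name diagonal_uniform_init and its parameters are illustrative assumptions, not the authors' implementation; see the paper for the actual DLU procedure.

```python
import numpy as np


def diagonal_uniform_init(pop_size, lower, upper):
    """Hypothetical sketch of diagonal-subspace sampling.

    Places `pop_size` individuals at uniformly spaced points along the
    main diagonal from `lower` to `upper`, instead of sampling the whole
    search space. This is an assumed interpretation of DLU, not the
    published algorithm.
    """
    lower = np.asarray(lower, dtype=float)
    upper = np.asarray(upper, dtype=float)
    # One fraction in [0, 1] per individual; the same fraction is applied
    # to every dimension, so each point lies on the diagonal.
    t = np.linspace(0.0, 1.0, pop_size)[:, None]   # shape (pop_size, 1)
    return lower + t * (upper - lower)             # shape (pop_size, dim)


# Example: 5 individuals in a 3-D search space bounded by [-5, 5]^3.
population = diagonal_uniform_init(5, [-5.0] * 3, [5.0] * 3)
print(population)
```

Under this reading, the sample covers only a one-dimensional subspace, so its size does not need to grow with the problem dimension, which is consistent with the abstract's claim of avoiding the curse of dimensionality at initialization.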
Li, Q., Bai, Y., & Gao, W. (2021). Improved Initialization Method for Metaheuristic Algorithms: A Novel Search Space View. IEEE Access, 9, 121366–121384. https://doi.org/10.1109/ACCESS.2021.3073480