Surrogate-assisted evolutionary algorithms (SAEAs) have performed well on low- and medium-scale expensive optimization problems (EOPs). However, as the dimensionality increases, existing SAEAs struggle to build reliable surrogates for solving large-scale EOPs. In this paper, we propose a progressive sampling surrogate-assisted particle swarm optimization (PS-SAPSO) to efficiently solve large-scale EOPs from the perspective of data collection and model training. For data collection, a progressive sampling strategy with a restart operation is proposed to collect new sample solutions during the evolution process for training the radial basis function network (RBFN) surrogate. Specifically, a social learning particle swarm optimization is employed to generate the new sample solutions under the control of a progressively varying stop criterion. For model training, a dynamic tuning strategy is proposed to obtain a reliable RBFN by adaptively adjusting the hyperparameter setting during the evolution process. The experimental results show that PS-SAPSO achieves competitive or better performance compared with four state-of-the-art SAEAs on widely used benchmark functions. Moreover, ablation experiments are conducted to show the effectiveness of the components of the PS-SAPSO algorithm.
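The sketch below is an illustrative outline only, not the authors' implementation: it shows the general shape of a surrogate-assisted loop with an RBFN surrogate, a hyperparameter-tuning step, and surrogate-driven sampling of new expensive evaluations. The RBFN fit, the width-tuning rule, and the candidate-generation step are simplified stand-ins (the paper instead evolves candidates with social learning PSO under a progressively varying stop criterion), and names such as expensive_fn, n_init, and widths are assumptions introduced here for illustration.

```python
# Minimal sketch of a PS-SAPSO-like surrogate-assisted loop (assumptions noted above).
import numpy as np

def rbfn_fit(X, y, width):
    """Fit a Gaussian-kernel RBFN with centers at the training points."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    Phi = np.exp(-d2 / (2.0 * width ** 2))
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return X, w, width

def rbfn_predict(model, Xq):
    centers, w, width = model
    d2 = np.sum((Xq[:, None, :] - centers[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2.0 * width ** 2)) @ w

def tune_width(X, y, widths=(0.5, 1.0, 2.0, 4.0)):
    """Crude stand-in for dynamic hyperparameter tuning: pick the kernel width
    with the lowest leave-one-out error on the current archive."""
    best_w, best_err = widths[0], np.inf
    for w in widths:
        errs = []
        for i in range(len(X)):
            mask = np.arange(len(X)) != i
            m = rbfn_fit(X[mask], y[mask], w)
            errs.append((rbfn_predict(m, X[i:i + 1])[0] - y[i]) ** 2)
        if np.mean(errs) < best_err:
            best_w, best_err = w, np.mean(errs)
    return best_w

def surrogate_assisted_loop(expensive_fn, dim, bounds, n_init=20, budget=100, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, size=(n_init, dim))       # initial sample archive
    y = np.array([expensive_fn(x) for x in X])        # expensive evaluations
    while len(y) < budget:
        width = tune_width(X, y)                      # adjust surrogate hyperparameter
        model = rbfn_fit(X, y, width)
        # Surrogate-driven sampling: random candidates around the incumbent here;
        # the paper evolves candidates with SL-PSO instead.
        best = X[np.argmin(y)]
        cand = np.clip(best + rng.normal(0, 0.1 * (hi - lo), (200, dim)), lo, hi)
        x_new = cand[np.argmin(rbfn_predict(model, cand))]
        X = np.vstack([X, x_new])
        y = np.append(y, expensive_fn(x_new))         # one new true evaluation
    return X[np.argmin(y)], y.min()

if __name__ == "__main__":
    sphere = lambda x: float(np.sum(x ** 2))
    x_best, f_best = surrogate_assisted_loop(sphere, dim=10, bounds=(-5.0, 5.0))
    print(f_best)
```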
CITATION STYLE
Wang, H. R., Chen, C. H., Li, Y., Zhang, J., & Zhan, Z. H. (2022). Progressive sampling surrogate-assisted particle swarm optimization for large-scale expensive optimization. In GECCO 2022 - Proceedings of the 2022 Genetic and Evolutionary Computation Conference (pp. 40–48). Association for Computing Machinery. https://doi.org/10.1145/3512290.3528710