Over the last few decades, a plethora of successful optimization concepts, algorithms, techniques, and software packages has emerged, each trying to excel in its own niche. Logically, combining a carefully selected subset of them may deliver a novel approach that brings together the best of some of those previously independent worlds. The span of applicability of the new approach and the magnitude of its improvement depend entirely on the selected techniques and on how well they are woven together. In this study, we combine NSGA-III with local search and use the recently proposed Karush-Kuhn-Tucker Proximity Measure (KKTPM) to guide the whole process. These three carefully selected building blocks are intended to perform well on several levels. Here, we focus on diversity and convergence (hence DC-NSGA-III), using local search and KKTPM, respectively, within a multi- and many-objective algorithm (NSGA-III). The results show that DC-NSGA-III significantly improves performance on several standard multi- and many-objective optimization problems.
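The general pattern the abstract describes — an evolutionary population whose worst-converged members are refined by local search, with a KKT-based proximity score deciding who gets refined — can be sketched as follows. This is a minimal illustration, not the authors' DC-NSGA-III: the bi-objective test problem, the weighted-sum gradient-descent local search, and the simplified KKT error (smallest norm of a convex combination of objective gradients, which is zero at a Pareto-optimal point of an unconstrained problem) are all stand-ins chosen for brevity, not the actual KKTPM computation or NSGA-III machinery.

```python
import numpy as np

def f(x):
    # Toy unconstrained bi-objective problem (assumed for illustration):
    # minimize f1 = ||x||^2 and f2 = ||x - 1||^2; Pareto-optimal points
    # are x = c * ones for c in [0, 1].
    return np.array([np.sum(x**2), np.sum((x - 1.0)**2)])

def grads(x):
    # Gradients of f1 and f2 at x.
    return 2.0 * x, 2.0 * (x - 1.0)

def kkt_error(x):
    # Simplified stand-in for a KKT proximity measure (unconstrained case):
    # the smallest norm of a convex combination of the objective gradients,
    # minimized over a grid of weights. Zero at Pareto-optimal points.
    g1, g2 = grads(x)
    return min(np.linalg.norm(lam * g1 + (1.0 - lam) * g2)
               for lam in np.linspace(0.0, 1.0, 101))

def local_search(x, w=0.5, steps=50, lr=0.05):
    # Stand-in local search: gradient descent on the weighted sum
    # w * f1 + (1 - w) * f2, which drives x toward a Pareto-optimal point.
    for _ in range(steps):
        g1, g2 = grads(x)
        x = x - lr * (w * g1 + (1.0 - w) * g2)
    return x

def hybrid_step(pop, top_k=2):
    # One hybrid iteration in the spirit of the abstract: score every
    # member by the KKT proxy, then apply local search only to the
    # top_k worst-converged members, leaving the rest untouched.
    errs = [kkt_error(x) for x in pop]
    worst = np.argsort(errs)[-top_k:]
    for i in worst:
        pop[i] = local_search(pop[i])
    return pop
```

In a full algorithm this refinement step would sit inside the NSGA-III generational loop, so that the proximity measure steers convergence while the reference-direction-based selection of NSGA-III maintains diversity.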
Seada, H., Abouhawwash, M., & Deb, K. (2017). Towards a better balance of diversity and convergence in NSGA-III: First results. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 10173 LNCS, pp. 545–559). Springer Verlag. https://doi.org/10.1007/978-3-319-54157-0_37