NSGANetV2: Evolutionary Multi-objective Surrogate-Assisted Neural Architecture Search

Abstract

In this paper, we propose an efficient NAS algorithm for generating task-specific models that are competitive under multiple competing objectives. It comprises two surrogates: one at the architecture level to improve sample efficiency, and one at the weights level, through a supernet, to improve gradient-descent training efficiency. On standard benchmark datasets (CIFAR-10, CIFAR-100, ImageNet), the resulting models, dubbed NSGANetV2, either match or outperform models from existing approaches, with the search being orders of magnitude more sample efficient. Furthermore, we demonstrate the effectiveness and versatility of the proposed method on six diverse non-standard datasets, e.g., STL-10, Flowers102, Oxford-IIIT Pets, and FGVC Aircraft. In all cases, NSGANetV2 improves the state of the art (under the mobile setting), suggesting that NAS can be a viable alternative to conventional transfer-learning approaches for handling diverse scenarios such as small-scale or fine-grained datasets. Code is available at https://github.com/mikelzc1990/nsganetv2.
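The surrogate-assisted loop the abstract describes can be illustrated with a minimal, self-contained sketch. Everything here is hypothetical: the bit-vector encoding, the toy `true_eval` objectives (a stand-in for expensive supernet-based evaluation of error and FLOPs), and the 1-nearest-neighbour predictor (a placeholder for the paper's actual architecture-level surrogate ensemble). It shows only the general pattern: evaluate a few architectures exactly, screen a large candidate pool cheaply with the surrogate, and spend real evaluations only on the most promising candidates.

```python
# Toy sketch of surrogate-assisted multi-objective architecture search.
# All objectives and models are illustrative, not those of NSGANetV2.
import random

random.seed(0)

def true_eval(arch):
    # Stand-in for an expensive evaluation: returns (error, flops)
    # for an architecture encoded as a bit vector.
    flops = sum(arch)
    error = 1.0 / (1 + flops) + 0.01 * sum(a * i for i, a in enumerate(arch))
    return error, flops

def fit_surrogate(archive):
    # Architecture-level surrogate: predict error via 1-nearest
    # neighbour over previously evaluated architectures (placeholder
    # for the paper's adaptive surrogate selection).
    def predict(arch):
        best = min(archive,
                   key=lambda rec: sum(x != y for x, y in zip(arch, rec[0])))
        return best[1][0]
    return predict

def dominates(a, b):
    # a dominates b if a is no worse in all objectives, better in one.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(records):
    return [r for r in records
            if not any(dominates(o[1], r[1]) for o in records if o is not r)]

def search(n_bits=8, n_init=10, n_iters=5, pool=50):
    # 1) Evaluate a small initial sample exactly.
    archive = []
    for _ in range(n_init):
        arch = tuple(random.randint(0, 1) for _ in range(n_bits))
        archive.append((arch, true_eval(arch)))
    for _ in range(n_iters):
        predict = fit_surrogate(archive)
        # 2) Screen a large candidate pool cheaply with the surrogate.
        cands = [tuple(random.randint(0, 1) for _ in range(n_bits))
                 for _ in range(pool)]
        cands.sort(key=lambda a: (predict(a), sum(a)))
        # 3) Spend real evaluations only on the top few candidates.
        seen = {a for a, _ in archive}
        for arch in cands[:3]:
            if arch not in seen:
                archive.append((arch, true_eval(arch)))
    return pareto_front(archive)

front = search()
```

In the paper, step 3 additionally fine-tunes the selected subnets with weights inherited from a supernet, which is the second (weight-level) surrogate the abstract refers to.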

Citation (APA)

Lu, Z., Deb, K., Goodman, E., Banzhaf, W., & Boddeti, V. N. (2020). NSGANetV2: Evolutionary Multi-objective Surrogate-Assisted Neural Architecture Search. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 12346 LNCS, pp. 35–51). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-58452-8_3
