GeNAS: Neural Architecture Search with Better Generalization

Abstract

Neural Architecture Search (NAS) aims to automatically discover the network architecture with the best test performance. Recent NAS approaches rely on validation loss or accuracy to identify superior networks for the target data. In this paper, we investigate a new search measure for finding architectures with better generalization. We demonstrate that the flatness of the loss surface can serve as a promising proxy for predicting the generalization capability of neural network architectures. We evaluate the proposed method on various search spaces, showing performance similar to or better than state-of-the-art NAS methods. Notably, the architectures found with the flatness measure generalize robustly to various shifts in data distribution (e.g., ImageNet-V2, -A, -O) as well as to downstream tasks such as object detection and semantic segmentation.
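
To make the flatness proxy concrete, here is a minimal PyTorch sketch that scores a candidate architecture by the average loss increase under small random weight perturbations, a smaller increase indicating a flatter minimum. The Gaussian perturbation scheme, function names, and the sigma/n_samples hyperparameters are illustrative assumptions for this sketch, not the paper's exact formulation.

```python
# Sketch of a perturbation-based flatness proxy (assumed scheme, not
# necessarily GeNAS's exact measure): score = mean loss increase after
# adding N(0, sigma^2) noise to every weight.
import copy
import torch

def mean_loss(model, loss_fn, loader, device="cpu"):
    """Average loss of `model` over a (small) validation loader."""
    model.eval()
    total, count = 0.0, 0
    with torch.no_grad():
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            total += loss_fn(model(x), y).item() * x.size(0)
            count += x.size(0)
    return total / count

def flatness_score(model, loss_fn, loader, sigma=0.01, n_samples=5, device="cpu"):
    """Estimate loss-surface flatness around the current weights.

    Returns the mean increase in loss after perturbing all parameters with
    Gaussian noise; a smaller increase indicates a flatter minimum and, per
    the paper's hypothesis, better expected generalization.
    """
    base = mean_loss(model, loss_fn, loader, device)
    increases = []
    for _ in range(n_samples):
        perturbed = copy.deepcopy(model)  # perturb a copy, keep the original intact
        with torch.no_grad():
            for p in perturbed.parameters():
                p.add_(sigma * torch.randn_like(p))
        increases.append(mean_loss(perturbed, loss_fn, loader, device) - base)
    return sum(increases) / len(increases)  # lower = flatter = preferred
```

Under this sketch, a search procedure would rank candidate architectures by flatness_score (lower is better) instead of, or in addition to, validation loss or accuracy.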

Citation (APA)

Jeong, J., Yu, J., Park, G., Han, D., & Yoo, Y. J. (2023). GeNAS: Neural Architecture Search with Better Generalization. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 2023-August, pp. 911–919). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2023/101
