Designing neural architectures requires immense manual effort. This has motivated the development of neural architecture search (NAS) to automate the design. Previous NAS methods achieve promising results but run slowly, whereas zero-cost proxies run extremely fast but are less reliable. It is therefore of great potential to accelerate NAS via those zero-cost proxies. The existing method has two limitations: unforeseeable reliability and one-shot usage. To address these limitations, we present ProxyBO, an efficient Bayesian optimization (BO) framework that utilizes zero-cost proxies to accelerate neural architecture search. We apply a generalization-ability measurement to estimate the fitness of the proxies on the task during each iteration and design a novel acquisition function to combine BO with zero-cost proxies based on their dynamic influence. Extensive empirical studies show that ProxyBO consistently outperforms competitive baselines on five tasks from three public benchmarks. Concretely, ProxyBO achieves up to 5.41× and 3.86× speedups over the state-of-the-art approaches REA and BRP-NAS, respectively.
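The abstract describes weighting zero-cost proxies by their measured fitness on the task and folding them into the BO acquisition function. A minimal sketch of that idea, under assumptions not taken from the paper: proxy fitness is estimated here with a Kendall rank correlation between proxy scores and the true accuracies observed so far, and the blend is a simple weighted average (the function names and the exact weighting scheme are illustrative, not ProxyBO's actual formulation).

```python
def kendall_tau(xs, ys):
    """Kendall rank correlation between two equal-length score lists.

    Used here as a stand-in for a generalization-ability measurement:
    a proxy whose scores rank observed architectures like the true
    accuracies do gets a fitness near 1.0.
    """
    n = len(xs)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (xs[i] - xs[j]) * (ys[i] - ys[j])
            if s > 0:
                concordant += 1
            elif s < 0:
                discordant += 1
    pairs = n * (n - 1) / 2
    return (concordant - discordant) / pairs if pairs else 0.0


def proxy_weights(true_accs, proxy_score_lists):
    """Dynamic per-proxy weights: rank fitness on observations, floored at 0."""
    return [max(0.0, kendall_tau(scores, true_accs))
            for scores in proxy_score_lists]


def combined_acquisition(bo_score, proxy_scores, weights):
    """Blend the BO surrogate's acquisition value with proxy scores.

    The BO score always carries weight 1; proxies contribute in
    proportion to their current fitness (an illustrative choice).
    """
    total = 1.0 + sum(weights)
    return (bo_score + sum(w * s for w, s in zip(weights, proxy_scores))) / total
```

At each iteration the weights would be recomputed from all architectures evaluated so far, so an unreliable proxy's influence decays toward zero instead of being trusted in a one-shot fashion, which is the limitation the abstract highlights.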
CITATION STYLE
Shen, Y., Li, Y., Zheng, J., Zhang, W., Yao, P., Li, J., … Cui, B. (2023). ProxyBO: Accelerating Neural Architecture Search via Bayesian Optimization with Zero-Cost Proxies. In Proceedings of the 37th AAAI Conference on Artificial Intelligence, AAAI 2023 (Vol. 37, pp. 9792–9801). AAAI Press. https://doi.org/10.1609/aaai.v37i8.26169