SI-VDNAS: Semi-implicit variational dropout for hierarchical one-shot neural architecture search


Abstract

Bayesian methods have improved the interpretability and stability of neural architecture search (NAS). In this paper, we propose a novel probabilistic approach, namely Semi-Implicit Variational Dropout one-shot Neural Architecture Search (SI-VDNAS), that leverages semi-implicit variational dropout to support architecture search with variable operations and edges. SI-VDNAS achieves stable training that is not affected by over-selection of the skip-connect operation. Experimental results demonstrate that SI-VDNAS finds a convergent architecture with only 2.7M parameters within 0.8 GPU-days and achieves a 2.60% top-1 error rate on CIFAR-10. The convergent architecture obtains top-1 error rates of 16.20% and 25.6% when transferred to CIFAR-100 and ImageNet (mobile setting), respectively.
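To make the idea concrete, the sketch below shows plain Gaussian variational dropout applied to the mixing weights of a DARTS-style mixed edge, the one-shot setting SI-VDNAS operates in. This is a minimal illustration, not the paper's method: the semi-implicit variational family is not specified in this abstract, and all names (`theta`, `log_alpha`, `mixed_op`) and the toy candidate operations are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def variational_dropout_weights(theta, log_alpha, rng):
    """Sample mixing weights with multiplicative Gaussian noise,
    w = theta * (1 + sqrt(alpha) * eps), eps ~ N(0, 1) -- standard
    Gaussian variational dropout, used here as a stand-in for the
    paper's semi-implicit variational posterior."""
    eps = rng.standard_normal(theta.shape)
    return theta * (1.0 + np.exp(0.5 * log_alpha) * eps)

def mixed_op(x, ops, theta, log_alpha, rng):
    """DARTS-style mixed edge: softmax over the sampled (noisy)
    mixing weights, then a weighted sum of candidate-op outputs."""
    w = variational_dropout_weights(theta, log_alpha, rng)
    p = np.exp(w - w.max())
    p /= p.sum()                     # softmax over candidate ops
    return sum(pi * op(x) for pi, op in zip(p, ops))

# Toy candidate operations on a single edge (hypothetical choices).
ops = [
    lambda x: x,                     # skip-connect
    lambda x: np.tanh(x),            # nonlinearity standing in for conv
    lambda x: np.zeros_like(x),      # the "none" operation
]
theta = np.zeros(3)                  # per-operation mean weights
log_alpha = np.full(3, -2.0)         # small dropout rate, alpha = e^-2
y = mixed_op(np.ones(4), ops, theta, log_alpha, rng)
```

Because the dropout noise is resampled every forward pass, no single operation (such as skip-connect) can dominate the softmax deterministically during search, which is the kind of stabilization the abstract refers to.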

Citation (APA)

Wang, Y., Dai, W., Li, C., Zou, J., & Xiong, H. (2020). SI-VDNAS: Semi-implicit variational dropout for hierarchical one-shot neural architecture search. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 2021-January, pp. 2088–2095). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2020/289
