Ensembles of Networks Produced from Neural Architecture Search

Abstract

Neural architecture search (NAS) is a popular topic at the intersection of deep learning and high-performance computing. NAS optimizes the architecture of neural networks, along with their hyperparameters, to produce networks with superior performance. Most work has focused on producing a single best network to solve a machine learning problem, but because NAS methods generate many networks that perform very well, there is an opportunity to ensemble these networks for an improved result. Additionally, the diversity of network structures produced by NAS naturally biases the individual networks toward diverse predictions. The resulting ensemble improves on one built simply from duplicates of the best network architecture retrained to have unique weights.
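The abstract describes combining the predictions of the diverse networks that NAS produces. A minimal sketch of one common combination rule, averaging per-model class probabilities, is shown below; this rule and the function name `ensemble_predict` are illustrative assumptions, as the abstract does not specify how the ensemble is formed.

```python
import numpy as np

def ensemble_predict(member_probs):
    """Average class probabilities across ensemble members, then pick classes.

    member_probs: list of arrays, each of shape (n_samples, n_classes),
    one array per member network. (Assumed combination rule, not the
    paper's stated method.)
    """
    avg = np.mean(np.stack(member_probs, axis=0), axis=0)
    return avg.argmax(axis=1)

# Toy example: three "networks" disagree on the second sample;
# averaging their probabilities resolves the disagreement.
p1 = np.array([[0.9, 0.1], [0.4, 0.6]])
p2 = np.array([[0.8, 0.2], [0.7, 0.3]])
p3 = np.array([[0.7, 0.3], [0.6, 0.4]])
print(ensemble_predict([p1, p2, p3]))  # -> [0 0]
```

Diversity among the members is what makes this averaging useful: if every member were the same architecture retrained, their errors would be more correlated and the average would change fewer predictions.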

Citation (APA)
Herron, E. J., Young, S. R., & Potok, T. E. (2020). Ensembles of Networks Produced from Neural Architecture Search. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 12321 LNCS, pp. 223–234). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-59851-8_14
