Efficient automatic meta optimization search for few-shot learning

Abstract

Previous work on meta-learning has either relied on elaborately hand-designed network structures or adapted specialized learning rules to a particular domain. We propose a universal framework that optimizes the meta-learning process automatically by adopting neural architecture search (NAS). NAS automatically generates and evaluates the meta-learner's architecture for few-shot learning problems, while the meta-learner uses a meta-learning algorithm to optimize its parameters based on the distribution of learning tasks. Parameter sharing and experience replay are adopted to accelerate the architecture search, so it takes only 1–2 GPU days to find good architectures. Extensive experiments on Mini-ImageNet and Omniglot show that our algorithm excels at few-shot learning tasks. The best architecture found on Mini-ImageNet achieves competitive results when transferred to Omniglot, demonstrating the high transferability of architectures across different computer vision problems.
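
The abstract pairs two loops: an outer NAS loop that proposes and scores meta-learner architectures, and an inner meta-learning loop that adapts each candidate's parameters over a distribution of few-shot tasks. Below is a minimal, hypothetical sketch of that structure only, not the authors' implementation: it substitutes plain random search for the NAS controller, omits the parameter sharing and experience replay that give the paper its 1–2 GPU-day search cost, and uses toy sine-regression tasks in place of Mini-ImageNet/Omniglot. All function and variable names are illustrative assumptions.

import math
import random

import torch
import torch.nn as nn
import torch.nn.functional as F


def sample_task(k=10):
    # A toy few-shot regression task family: y = a * sin(x + p).
    a, p = random.uniform(0.5, 5.0), random.uniform(0.0, math.pi)

    def draw(n=k):
        xs = torch.rand(n, 1) * 10.0 - 5.0
        return xs, a * torch.sin(xs + p)

    return draw


def build_net(arch):
    # "arch" is a hypothetical encoding: a list of hidden-layer widths.
    layers, in_dim = [], 1
    for width in arch:
        layers += [nn.Linear(in_dim, width), nn.ReLU()]
        in_dim = width
    layers.append(nn.Linear(in_dim, 1))
    return nn.Sequential(*layers)


def functional_forward(net, params, x):
    # Forward pass with explicit parameters so the inner-loop update
    # stays differentiable with respect to the meta-parameters.
    i = 0
    for module in net:
        if isinstance(module, nn.Linear):
            x = F.linear(x, params[i], params[i + 1])
            i += 2
        else:
            x = module(x)
    return x


def meta_train_score(arch, meta_steps=100, inner_lr=0.01, k=10):
    # Score one candidate architecture by a short MAML-style meta-training run.
    net = build_net(arch)
    meta_opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()
    for _ in range(meta_steps):
        draw = sample_task(k)
        xs, ys = draw()   # support set
        xq, yq = draw()   # query set from the same task
        params = list(net.parameters())
        # Inner loop: one gradient step of task adaptation on the support set.
        grads = torch.autograd.grad(
            loss_fn(functional_forward(net, params, xs), ys),
            params, create_graph=True)
        adapted = [p - inner_lr * g for p, g in zip(params, grads)]
        # Outer loop: update meta-parameters from the post-adaptation query loss.
        meta_opt.zero_grad()
        loss_fn(functional_forward(net, adapted, xq), yq).backward()
        meta_opt.step()
    # Reward: negative query loss after one adaptation step on a held-out task.
    draw = sample_task(k)
    xs, ys = draw()
    xq, yq = draw()
    params = list(net.parameters())
    grads = torch.autograd.grad(
        loss_fn(functional_forward(net, params, xs), ys), params)
    adapted = [p - inner_lr * g for p, g in zip(params, grads)]
    with torch.no_grad():
        return -loss_fn(functional_forward(net, adapted, xq), yq).item()


# Random search stands in for the NAS controller; the paper's controller would
# reuse shared weights and replay past (architecture, reward) pairs instead of
# evaluating every candidate from scratch.
candidates = [[random.choice([16, 32, 64]) for _ in range(2)] for _ in range(4)]
scores = {tuple(a): meta_train_score(a) for a in candidates}
print("best architecture:", max(scores, key=scores.get))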

Citation (APA)

Zheng, X., Wang, P., Wang, Q., Shi, Z., & Xu, F. (2019). Efficient automatic meta optimization search for few-shot learning. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11859 LNCS, pp. 223–234). Springer. https://doi.org/10.1007/978-3-030-31726-3_19
