Efficient Transfer Learning via Joint Adaptation of Network Architecture and Weight


Abstract

Transfer learning can boost performance on a target task by leveraging knowledge from the source domain. Recent work in neural architecture search (NAS), especially one-shot NAS, can aid transfer learning by providing a sufficiently large network search space. However, existing NAS methods tend to approximate huge search spaces by explicitly building giant super-networks with multiple sub-paths, and they discard the super-network weights once a child structure is found. Both characteristics cause repetitive network training on source tasks in transfer learning. To remedy these issues, we reduce the super-network size by randomly dropping connections between network blocks while still embedding a larger search space. Moreover, we reuse super-network weights to avoid redundant training, proposing a novel framework with two modules: a neural architecture search module for architecture transfer and a neural weight search module for weight transfer. Both modules search on the target task over a reduced super-network, so we only need to train once on the source task. We evaluate our framework on MS-COCO and CUB-200 for object detection and fine-grained image classification, respectively, and show promising improvements with only O(CN) super-network complexity.
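The connection-dropping idea in the abstract can be sketched as follows. This is a toy illustration only, not the authors' implementation: the chain-structured edge set, the function name, and the `keep_prob` parameter are all assumptions made for the sketch. The point is that dropping edges shrinks each trained super-network, while the union over random samples can still cover the full search space.

```python
import random

def sample_reduced_supernet(num_blocks, num_candidates, keep_prob=0.5, seed=None):
    """Randomly drop inter-block connections of a one-shot super-network.

    An edge (i, i + 1, k) means block i + 1 may receive the output of
    candidate operation k of block i. Returns the kept edge list, which
    defines a smaller super-network to train.
    """
    rng = random.Random(seed)
    # Full edge set of a chain-structured super-network (an assumption
    # of this sketch; real super-networks may have richer topologies).
    edges = [(i, i + 1, k)
             for i in range(num_blocks - 1)
             for k in range(num_candidates)]
    kept = [e for e in edges if rng.random() < keep_prob]
    # Guarantee connectivity: keep at least one candidate per block pair.
    for i in range(num_blocks - 1):
        if not any(e[0] == i for e in kept):
            kept.append((i, i + 1, rng.randrange(num_candidates)))
    return sorted(kept)
```

Under this sketch, training once on the source task with randomly reduced super-networks yields weights that the two search modules can then reuse on the target task.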

Citation (APA)

Sun, M., Dou, H., & Yan, J. (2020). Efficient Transfer Learning via Joint Adaptation of Network Architecture and Weight. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 12358 LNCS, pp. 463–480). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-58601-0_28
