Pruning Meta-Trained Networks for On-Device Adaptation

Abstract

Adapting neural networks to unseen tasks with few training samples on resource-constrained devices benefits various Internet-of-Things applications. Such networks must learn new tasks from only a few shots and remain compact in size. Meta-learning enables few-shot learning, yet meta-trained networks can be over-parameterised. Naively combining standard compression techniques such as network pruning with meta-learning, however, jeopardises the ability to adapt quickly. In this work, we propose adaptation-aware network pruning (ANP), a novel pruning scheme that works with existing meta-learning methods to produce a compact network capable of fast adaptation. ANP uses a weight-importance metric based on the sensitivity of the meta-objective rather than the conventional loss function, and adopts derivative approximation and layer-wise pruning to reduce the overhead of computing this new metric. Evaluations on few-shot classification benchmarks show that ANP can prune meta-trained convolutional and residual networks by 85% without impairing their fast adaptation.
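
The abstract names the two key ingredients of ANP: a weight-importance score driven by the sensitivity of the meta-objective, and layer-wise pruning to keep the cost of computing that score manageable. The sketch below illustrates one plausible reading of that recipe in PyTorch. The first-order score |w · ∂L_meta/∂w|, the helper names, and the masking logic are illustrative assumptions, not the paper's actual implementation; only the 0.85 sparsity target corresponds to the 85% pruning ratio reported in the abstract.

```python
import torch


def meta_objective_importance(model, meta_loss):
    """Score each weight by the sensitivity of the meta-objective.

    A first-order Taylor score |w * dL_meta/dw| is assumed here; the
    paper's exact metric and derivative approximation may differ.
    Assumes model gradients were zeroed before computing meta_loss.
    """
    meta_loss.backward()  # gradients of the meta-objective, not a task loss
    scores = {}
    for name, p in model.named_parameters():
        if p.grad is not None and p.dim() > 1:  # skip biases / norm params
            scores[name] = (p.detach() * p.grad.detach()).abs()
    return scores


def layerwise_prune(model, scores, sparsity=0.85):
    """Prune each layer independently to the target sparsity.

    Per-layer thresholds avoid ranking all weights globally, which is one
    way to keep the overhead of the importance computation low.
    """
    masks = {}
    for name, s in scores.items():
        k = max(1, int(sparsity * s.numel()))    # number of weights to remove
        threshold = s.flatten().kthvalue(k).values
        masks[name] = (s > threshold).float()    # keep weights above threshold
    with torch.no_grad():                        # apply masks in place
        for name, p in model.named_parameters():
            if name in masks:
                p.mul_(masks[name])
    return masks
```

In a MAML-style setup, `meta_loss` would be the outer-loop (query-set) loss after inner-loop adaptation, so the score reflects how much each weight matters for adaptation rather than for any single task.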

Citation (APA)

Gao, D., He, X., Zhou, Z., Tong, Y., & Thiele, L. (2021). Pruning meta-trained networks for on-device adaptation. In Proceedings of the International Conference on Information and Knowledge Management (pp. 514–523). Association for Computing Machinery. https://doi.org/10.1145/3459637.3482378
