DnA: Improving Few-Shot Transfer Learning with Low-Rank Decomposition and Alignment

Abstract

Self-supervised (SS) learning has achieved remarkable success in learning strong representations for in-domain few-shot and semi-supervised tasks. However, when such representations are transferred to downstream tasks with domain shifts, performance degrades compared to supervised counterparts, especially in the few-shot regime. In this paper, we propose to boost the transferability of self-supervised pre-trained models on cross-domain tasks via a novel self-supervised alignment step on the target domain, using only unlabeled data, before conducting the downstream supervised fine-tuning. We also present a new reparameterization of the pre-trained weights to mitigate potential catastrophic forgetting during the alignment step. It involves a low-rank and sparse decomposition that elegantly balances preserving the source-domain knowledge without forgetting (by fixing the low-rank subspace) against the extra flexibility to absorb new out-of-domain knowledge (by freeing the sparse residual). Our resulting framework, termed Decomposition-and-Alignment (DnA), significantly improves the few-shot transfer performance of SS pre-trained models to downstream tasks with domain gaps. (The code is released at https://github.com/VITA-Group/DnA .)
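The decomposition idea in the abstract can be sketched as follows. This is a minimal illustrative example, not the paper's actual implementation: the weight matrix, rank `r`, and the top-magnitude sparsity rule are all assumptions; the paper's exact decomposition and training procedure are described in the full text.

```python
import numpy as np

# Hypothetical sketch of a low-rank + sparse reparameterization:
# W ~= low_rank (frozen, preserves source-domain knowledge)
#      + S (sparse residual, freed during the alignment step).
rng = np.random.default_rng(0)
W = rng.standard_normal((64, 64))  # stand-in for a pre-trained weight matrix

# Low-rank part via truncated SVD (rank r is an illustrative choice).
r = 8
U, s, Vt = np.linalg.svd(W, full_matrices=False)
low_rank = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

# Sparse residual: keep only the largest-magnitude entries of W - low_rank
# (here, the top 5% by absolute value -- again an illustrative choice).
residual = W - low_rank
threshold = np.quantile(np.abs(residual), 0.95)
S = np.where(np.abs(residual) >= threshold, residual, 0.0)

# During alignment, only S would receive gradient updates; low_rank stays
# fixed, so the effective weight is their sum.
W_reparam = low_rank + S
```

Under this reparameterization, catastrophic forgetting is limited because the frozen low-rank factor bounds how far the effective weights can drift from the pre-trained subspace, while the sparse residual gives a small number of free parameters to adapt to the target domain.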

Citation (APA)

Jiang, Z., Chen, T., Chen, X., Cheng, Y., Zhou, L., Yuan, L., … Wang, Z. (2022). DnA: Improving Few-Shot Transfer Learning with Low-Rank Decomposition and Alignment. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 13680 LNCS, pp. 239–256). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-20044-1_14
