Efficiently Tuned Parameters are Task Embeddings

Abstract

Intermediate-task transfer can benefit a wide range of NLP tasks when source datasets are properly selected. However, it is computationally infeasible to experiment with all intermediate-task transfer combinations, which makes choosing a useful source task a challenging problem. In this paper, we hypothesize that the task-specific parameters updated by parameter-efficient tuning methods encode task-specific information and can therefore be predictive of inter-task transferability. We thus propose to exploit these efficiently tuned parameters as off-the-shelf task embeddings for the efficient selection of source datasets for intermediate-task transfer. We experiment with 11 text classification tasks and 11 question answering tasks. Experimental results show that our approach consistently outperforms existing inter-task transferability prediction methods while being conceptually simple and computationally efficient. Our analysis also reveals that the ability of efficiently tuned parameters to predict transferability is disentangled from their in-task performance, which allows us to use parameters from early checkpoints as task embeddings to further improve efficiency.
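The core idea in the abstract can be illustrated with a minimal sketch: treat the parameters updated during parameter-efficient tuning (e.g. a soft prompt or prefix) as a task embedding, and rank candidate source tasks by their similarity to the target task's embedding. This is an assumption-laden illustration, not the paper's exact implementation; the function names, the use of plain cosine similarity, and the dict-of-arrays parameter format are all hypothetical.

```python
import numpy as np


def task_embedding(tuned_params):
    # Hypothetical helper: flatten all efficiently tuned parameter
    # arrays (e.g. soft-prompt or prefix weights) into one vector,
    # which serves as an off-the-shelf task embedding.
    return np.concatenate([p.ravel() for p in tuned_params.values()])


def rank_source_tasks(target_params, source_params_by_task):
    # Rank candidate source tasks by cosine similarity between each
    # source task's embedding and the target task's embedding.
    t = task_embedding(target_params)
    t = t / np.linalg.norm(t)
    scores = {}
    for name, params in source_params_by_task.items():
        s = task_embedding(params)
        scores[name] = float(t @ (s / np.linalg.norm(s)))
    # Highest-similarity source tasks first.
    return sorted(scores, key=scores.get, reverse=True)
```

Under this sketch, a source task whose tuned parameters point in nearly the same direction as the target's would be ranked as the most promising candidate for intermediate-task transfer, without running any transfer experiments.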

Citation (APA)

Zhou, W., Xu, C., & McAuley, J. (2022). Efficiently Tuned Parameters are Task Embeddings. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, EMNLP 2022 (pp. 5007–5014). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.emnlp-main.334
