APrompt: Attention Prompt Tuning for Efficient Adaptation of Pre-trained Language Models

38 citations · 13 Mendeley readers

Abstract

With the continuous growth of large language models, fine-tuning these models for new tasks has become increasingly parameter-intensive. Prompt tuning, which tunes only a small set of soft prompts, has emerged as an effective and efficient approach for adapting large pre-trained language models. However, most existing prompt tuning approaches introduce prompts only at the input layer, limiting their performance and leaving large room for improvement. In this work, we propose a novel attention prompt tuning method, APROMPT, for efficient adaptation of pre-trained language models. We first demonstrate that existing prompt tuning can be considered a special case of attention prompt tuning. We then formally introduce APROMPT, which incorporates query, key, and value prompts into the attention layer to guide the attention computation during fine-tuning. Experimental results on the SuperGLUE benchmark consistently demonstrate that our proposed approach outperforms state-of-the-art baselines and the full fine-tuning method with pre-trained models at different scales. In addition, a comprehensive set of ablation studies validates the effectiveness of the prompt design, as well as the efficiency of our approach.
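The core idea described in the abstract — injecting trainable prompts directly into the attention computation rather than only at the input layer — can be illustrated with a minimal sketch. This is an assumption-laden illustration, not the paper's exact formulation: it shows single-head attention with hypothetical key/value prompt matrices prepended to the (frozen) keys and values, which is the general mechanism the abstract describes (the full APROMPT method also involves query prompts and operates per layer of a pre-trained model).

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_with_prompts(X, Wq, Wk, Wv, key_prompt, value_prompt):
    """Single-head attention sketch where trainable key/value prompts
    (shape: (num_prompts, d)) are prepended to the keys and values
    computed from the frozen projections Wq, Wk, Wv.

    X: (seq_len, d_model); Wq/Wk/Wv: (d_model, d).
    All names here are hypothetical, for illustration only.
    """
    Q = X @ Wq                                          # (seq_len, d)
    K = np.concatenate([key_prompt, X @ Wk], axis=0)    # (m + seq_len, d)
    V = np.concatenate([value_prompt, X @ Wv], axis=0)  # (m + seq_len, d)
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                       # (seq_len, m + seq_len)
    return softmax(scores) @ V                          # (seq_len, d)
```

During fine-tuning, only the prompt matrices (here `key_prompt` and `value_prompt`) would receive gradient updates while the projection weights stay frozen, which is what makes the approach parameter-efficient.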

Citation (APA)

Wang, Q., Mao, Y., Wang, J., Yu, H., Nie, S., Wang, S., … Liu, D. (2023). APrompt: Attention Prompt Tuning for Efficient Adaptation of Pre-trained Language Models. In EMNLP 2023 - 2023 Conference on Empirical Methods in Natural Language Processing, Proceedings (pp. 9147–9160). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.emnlp-main.567
