Interpreting and Exploiting Functional Specialization in Multi-Head Attention under Multi-task Learning


Abstract

Transformer-based models, despite achieving super-human performance on several downstream tasks, are often regarded as black boxes and used as a whole. It remains unclear what mechanisms they have learned, especially in their core module: multi-head attention. Inspired by functional specialization in the human brain, which helps it handle multiple tasks efficiently, this work investigates whether the multi-head attention module evolves a similar separation of functions under multi-task training and, if so, whether this mechanism can further improve model performance. To investigate these questions, we introduce an interpreting method to quantify the degree of functional specialization in multi-head attention. We further propose a simple multi-task training method to increase functional specialization and mitigate negative information transfer in multi-task learning. Experimental results on seven pre-trained transformer models demonstrate that multi-head attention does develop functional specialization after multi-task training, and that the degree of specialization is affected by the similarity of the tasks. Moreover, the multi-task training strategy based on functional specialization boosts performance in both multi-task learning and transfer learning without adding any parameters.
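To make the idea of "quantifying functional specialization" concrete, here is a minimal illustrative sketch, not the paper's actual interpreting method: assume each task yields a head-importance vector (e.g., from gradient-based attribution), and measure specialization as one minus the mean pairwise cosine similarity of those vectors across tasks. The task names and importance values below are hypothetical.

```python
# Illustrative sketch (NOT the paper's exact method): score how
# differently a set of tasks relies on the same attention heads.
# Head-importance vectors are hypothetical stand-ins for scores one
# might obtain via per-task attribution (e.g., gradient saliency).
import math

def cosine(u, v):
    """Cosine similarity between two equal-length importance vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def specialization_score(task_importance):
    """1 - mean pairwise cosine similarity of per-task head-importance
    vectors. Near 0: tasks reuse the same heads; near 1: tasks rely on
    disjoint heads (strong functional specialization)."""
    tasks = list(task_importance)
    sims = [cosine(task_importance[a], task_importance[b])
            for i, a in enumerate(tasks) for b in tasks[i + 1:]]
    return 1.0 - sum(sims) / len(sims)

# Two tasks relying on disjoint heads -> high specialization
disjoint = {"sst2": [1, 1, 0, 0], "mnli": [0, 0, 1, 1]}
# Two tasks relying on the same heads -> low specialization
shared = {"sst2": [1, 1, 0, 0], "qqp": [1, 1, 0, 0]}

print(specialization_score(disjoint))  # 1.0
print(specialization_score(shared))    # ~0.0
```

Under this toy metric, task similarity directly lowers the score, which mirrors the abstract's finding that the observed degree of specialization depends on how similar the trained tasks are.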

Citation (APA)

Li, C., Wang, S., Zhang, Y., Zhang, J., & Zong, C. (2023). Interpreting and Exploiting Functional Specialization in Multi-Head Attention under Multi-task Learning. In EMNLP 2023 - 2023 Conference on Empirical Methods in Natural Language Processing, Proceedings (pp. 16460–16476). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.emnlp-main.1026
