Adapting a model to a handful of personalized examples is challenging, especially when the model has an enormous number of parameters, as Transformer-based pre-trained models do. The standard practice of fine-tuning all parameters requires storing a full copy of the huge model for each user. In this work, we introduce a lightweight approach dubbed UserAdapter, which freezes the hundreds of millions of parameters of the Transformer model and optimizes only a tiny user-specific vector. We take sentiment analysis as a test bed and collect datasets of reviews from Yelp and IMDB. Results show that, on both datasets, UserAdapter achieves better accuracy than a standard fine-tuned Transformer-based pre-trained model. More importantly, UserAdapter offers an efficient way to produce a personalized Transformer model, adding less than 0.5% extra parameters per user.
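To make the parameter-efficient setup concrete, here is a minimal PyTorch sketch of the idea described above: all weights of a pretrained encoder are frozen, and only a short per-user vector (plus a small shared classification head) is trained. The choice of roberta-base, the prefix length, and the class name UserAdapterClassifier are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn as nn
from transformers import RobertaModel

class UserAdapterClassifier(nn.Module):
    def __init__(self, num_users: int, num_labels: int, user_vec_len: int = 4):
        super().__init__()
        self.encoder = RobertaModel.from_pretrained("roberta-base")
        for p in self.encoder.parameters():
            p.requires_grad = False  # freeze all pretrained Transformer weights
        hidden = self.encoder.config.hidden_size
        self.user_vec_len = user_vec_len
        # One tiny trainable vector per user (user_vec_len * hidden floats),
        # far below 0.5% of the frozen encoder's parameter count.
        self.user_vectors = nn.Embedding(num_users, user_vec_len * hidden)
        # Small classification head shared across all users.
        self.classifier = nn.Linear(hidden, num_labels)

    def forward(self, input_ids, attention_mask, user_ids):
        # Embed tokens with the frozen embedding table.
        tok_emb = self.encoder.embeddings.word_embeddings(input_ids)
        b = input_ids.size(0)
        # Reshape each user's vector into user_vec_len pseudo-token embeddings
        # and prepend them to the token sequence.
        user_emb = self.user_vectors(user_ids).view(b, self.user_vec_len, -1)
        inputs_embeds = torch.cat([user_emb, tok_emb], dim=1)
        prefix_mask = torch.ones(b, self.user_vec_len,
                                 dtype=attention_mask.dtype,
                                 device=attention_mask.device)
        mask = torch.cat([prefix_mask, attention_mask], dim=1)
        out = self.encoder(inputs_embeds=inputs_embeds, attention_mask=mask)
        # The original <s> (CLS) token now sits right after the user prefix.
        cls = out.last_hidden_state[:, self.user_vec_len]
        return self.classifier(cls)
```

In training, only the parameters with requires_grad=True (the user vectors and the shared head) receive gradients, e.g. torch.optim.Adam(p for p in model.parameters() if p.requires_grad); a per-user checkpoint then amounts to a single user_vec_len × hidden vector rather than a full model copy.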
Zhong, W., Tang, D., Wang, J., Yin, J., & Duan, N. (2021). UserAdapter: Few-Shot User Learning in Sentiment Analysis. In Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021 (pp. 1484–1488). Association for Computational Linguistics. https://doi.org/10.18653/v1/2021.findings-acl.129