UserAdapter: Few-Shot User Learning in Sentiment Analysis


Abstract

Adapting a model to a handful of personalized data points is challenging, especially when the model has an enormous number of parameters, as Transformer-based pretrained models do. The standard approach of fine-tuning all parameters requires storing a separate copy of the huge model for each user. In this work, we introduce a lightweight approach dubbed UserAdapter, which freezes the hundreds of millions of parameters of the Transformer model and optimizes only a tiny user-specific vector. We take sentiment analysis as a test bed and collect datasets of reviews from Yelp and IMDB. Results show that, on both datasets, UserAdapter achieves better accuracy than a standard fully fine-tuned Transformer-based pretrained model. More importantly, UserAdapter offers an efficient way to produce a personalized Transformer model, adding less than 0.5% additional parameters per user.
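The core idea can be illustrated with a toy sketch: the pretrained model's weights are frozen, and gradient descent updates only a small per-user vector. Note this is a simplified stand-in, not the paper's implementation; the actual UserAdapter injects the user vector into the Transformer's input sequence, whereas here a hypothetical frozen linear classifier stands in for the encoder, and the user vector simply shifts its input representation.

```python
import numpy as np

# Toy sketch of the UserAdapter idea (an assumption-laden stand-in, not the
# paper's code): model weights W are frozen; only a tiny per-user vector u
# is trained on that user's handful of labeled examples.
rng = np.random.default_rng(0)

d = 8                        # hidden size of the "frozen" encoder output
W = rng.normal(size=d)       # frozen classifier weights (stand-in for the Transformer)

def predict(h, u):
    """Sentiment probability for encoding h, personalized by user vector u."""
    z = W @ (h + u)          # the user vector shifts the frozen representation
    return 1.0 / (1.0 + np.exp(-z))

def nll(X, y, u):
    """Mean negative log-likelihood over a user's few examples."""
    ps = np.array([predict(h, u) for h in X])
    return -float(np.mean(y * np.log(ps) + (1 - y) * np.log(1 - ps)))

# A handful of "personalized" examples for one user: (encoding, label) pairs.
X = rng.normal(size=(4, d))
y = np.array([1.0, 0.0, 1.0, 0.0])

u = np.zeros(d)              # the only trainable parameters: d floats per user
loss_before = nll(X, y, u)
lr = 0.1
for _ in range(200):         # plain gradient descent on the logistic loss
    grad = np.zeros(d)
    for h, t in zip(X, y):
        grad += (predict(h, u) - t) * W   # d(loss)/du by the chain rule
    u -= lr * grad / len(X)
loss_after = nll(X, y, u)
```

Because `W` never changes, each user costs only `d` extra floats on top of the shared frozen model, which is the storage argument made in the abstract.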

Citation (APA)

Zhong, W., Tang, D., Wang, J., Yin, J., & Duan, N. (2021). UserAdapter: Few-Shot User Learning in Sentiment Analysis. In Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021 (pp. 1484–1488). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.findings-acl.129
