Adversarial Robustness of Prompt-based Few-Shot Learning for Natural Language Understanding

4 citations · 16 Mendeley readers

Abstract

State-of-the-art few-shot learning (FSL) methods leverage prompt-based fine-tuning to obtain remarkable results on natural language understanding (NLU) tasks. While much prior FSL work focuses on improving downstream task performance, the adversarial robustness of such methods remains poorly understood. In this work, we conduct an extensive study of several state-of-the-art FSL methods to assess their robustness to adversarial perturbations. To better understand how various factors contribute to robustness (or the lack of it), we evaluate prompt-based FSL methods against fully fine-tuned models along aspects such as the use of unlabeled data, multiple prompts, the number of few-shot examples, and model size and type. Our results on six GLUE tasks indicate that, compared to fully fine-tuned models, vanilla FSL methods suffer a notable relative drop in task performance (i.e., are less robust) in the face of adversarial perturbations. However, using (i) unlabeled data for prompt-based FSL and (ii) multiple prompts flips the trend. We further demonstrate that increasing the number of few-shot examples and the model size increases the adversarial robustness of vanilla FSL methods. Broadly, our work sheds light on the adversarial robustness evaluation of prompt-based FSL methods for NLU tasks.
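The comparison in the abstract hinges on the relative drop in task performance between clean and adversarially perturbed inputs: the larger the relative drop, the less robust the model. A minimal sketch of that metric follows; the accuracy numbers are illustrative placeholders, not results from the paper.

```python
def relative_drop(clean_acc: float, adv_acc: float) -> float:
    """Relative performance drop under adversarial perturbation.

    A larger value means the model loses a larger fraction of its
    clean-input accuracy when inputs are perturbed, i.e. it is less robust.
    """
    if clean_acc <= 0:
        raise ValueError("clean accuracy must be positive")
    return (clean_acc - adv_acc) / clean_acc


# Illustrative (made-up) numbers comparing a fully fine-tuned model with
# a vanilla prompt-based FSL model on the same task.
full_ft = relative_drop(clean_acc=0.90, adv_acc=0.72)      # 0.200
vanilla_fsl = relative_drop(clean_acc=0.85, adv_acc=0.55)  # ~0.353

print(f"full fine-tuning:   {full_ft:.3f}")
print(f"vanilla prompt FSL: {vanilla_fsl:.3f}")
```

Under these hypothetical numbers, the vanilla FSL model loses a larger fraction of its clean accuracy, matching the trend the abstract describes for models trained without unlabeled data or multiple prompts.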

Citation (APA)

Nookala, V. P. S., Verma, G., Mukherjee, S., & Kumar, S. (2023). Adversarial Robustness of Prompt-based Few-Shot Learning for Natural Language Understanding. In Findings of the Association for Computational Linguistics: ACL 2023 (pp. 2196–2208). Association for Computational Linguistics. https://doi.org/10.18653/v1/2023.findings-acl.138
