On Conditional and Compositional Language Model Differentiable Prompting


Abstract

Prompts have been shown to be an effective method to adapt a frozen Pretrained Language Model (PLM) to perform well on downstream tasks. Prompts can be represented by a human-engineered word sequence or by a learned continuous embedding. In this work, we investigate conditional and compositional differentiable prompting. We propose a new model, Prompt Production System (PROPS), which learns to transform task instructions or input metadata into continuous prompts that elicit task-specific outputs from the PLM. Our model uses a modular network structure based on our neural formulation of Production Systems, which allows the model to learn discrete rules: neural functions that learn to specialize in transforming particular prompt input patterns, making the model suitable for compositional transfer learning and few-shot learning. We present extensive empirical and theoretical analysis and show that PROPS consistently surpasses other PLM adaptation techniques, and often improves upon fully fine-tuned models, on compositional generalization tasks, controllable summarization, and multilingual translation, while needing fewer trainable parameters.
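
To make the abstract's architecture concrete, the sketch below illustrates one plausible reading of it: a bank of small "rule" networks, a learned selection mechanism that decides which rule fires for a given task instruction or metadata encoding, and a continuous prompt that is produced from the selected rule's output and prepended to the frozen PLM's input embeddings. This is a minimal illustration, not the authors' implementation; all module names, dimensions, and the soft attention-based rule selection are assumptions.

    # Minimal sketch (illustrative, not the paper's code): rules are small
    # MLPs; a learned attention over rule keys selects which rule transforms
    # the condition vector into a continuous prompt for a frozen PLM.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class PromptProductionSketch(nn.Module):
        def __init__(self, d_model=768, n_rules=4, prompt_len=8):
            super().__init__()
            # One MLP per rule; rules can specialize via the selection weights.
            self.rules = nn.ModuleList([
                nn.Sequential(nn.Linear(d_model, d_model), nn.ReLU(),
                              nn.Linear(d_model, d_model))
                for _ in range(n_rules)
            ])
            # Learned keys used to match a condition to a rule.
            self.rule_keys = nn.Parameter(torch.randn(n_rules, d_model))
            # Learned slots expanded into a length-`prompt_len` prompt.
            self.prompt_queries = nn.Parameter(torch.randn(prompt_len, d_model))

        def forward(self, cond):                       # cond: (batch, d_model)
            # Soft rule selection: attention of the condition over rule keys.
            scores = cond @ self.rule_keys.t()         # (batch, n_rules)
            weights = F.softmax(scores, dim=-1)
            # Apply every rule, then mix by the selection weights
            # (a hard top-1 choice would make the rule firing discrete).
            outs = torch.stack([r(cond) for r in self.rules], dim=1)
            mixed = (weights.unsqueeze(-1) * outs).sum(dim=1)   # (batch, d_model)
            # Broadcast the transformed condition into a continuous prompt.
            prompt = self.prompt_queries.unsqueeze(0) + mixed.unsqueeze(1)
            return prompt                              # (batch, prompt_len, d_model)

Under this reading, `cond` would come from encoding the task instruction or metadata (for example, with the frozen PLM's own embedding layer), and the returned prompt vectors would be concatenated in front of the PLM's input embeddings, with only the prompt generator's parameters trained.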

Citation (APA)

Pilault, J., Liu, C., Bansal, M., & Dreyer, M. (2023). On Conditional and Compositional Language Model Differentiable Prompting. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 2023-August, pp. 4136–4144). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2023/460
