Generative Zero-Shot Prompt Learning for Cross-Domain Slot Filling with Inverse Prompting


Abstract

Zero-shot cross-domain slot filling aims to transfer knowledge from a labeled source domain to an unlabeled target domain. Existing models either encode slot descriptions and examples or design handcrafted question templates using heuristic rules, and consequently suffer from poor generalization or poor robustness. In this paper, we propose a generative zero-shot prompt learning framework for cross-domain slot filling that improves both generalization and robustness over previous work. We further introduce a novel inverse prompting strategy that distinguishes different slot types to avoid the multiple-prediction problem, and an efficient prompt tuning strategy that achieves higher performance while training only a small number of prompt parameters. Experiments and analysis demonstrate the effectiveness of the proposed framework, with particularly large gains (+13.44% F1) on unseen slots.
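The efficient prompt tuning strategy mentioned above follows the general idea of training only a small set of continuous prompt parameters while the pretrained model stays frozen. The following is a minimal generic sketch of that idea, not the paper's implementation: all names (`forward`, `prompt`, the toy frozen weights `W`) are illustrative assumptions, and the "model" is reduced to a fixed linear scorer over mean-pooled embeddings.

```python
import numpy as np

# Hedged sketch (NOT the paper's method): prompt tuning freezes the
# pretrained model's weights and optimizes only a few continuous
# prompt embeddings that are prepended to the input sequence.

rng = np.random.default_rng(0)

D = 4                                       # embedding dimension
P = 2                                       # number of trainable prompt vectors
W = np.array([[1.0], [-1.0], [0.5], [2.0]]) # frozen "model" weights (never updated)

prompt = rng.normal(size=(P, D)) * 0.1      # the ONLY trainable parameters

def forward(x_emb, prompt):
    # Prepend the prompt vectors to the token embeddings,
    # mean-pool the sequence, then score with the frozen weights.
    seq = np.vstack([prompt, x_emb])
    return seq.mean(axis=0) @ W

x = rng.normal(size=(3, D))                 # toy "utterance" of 3 token embeddings
target = 1.0                                # toy regression target

lr = 0.1
for _ in range(300):
    err = forward(x, prompt).item() - target
    # Gradient of 0.5 * err**2 w.r.t. each prompt row: every row
    # contributes W / (P + num_tokens) to the pooled representation.
    grad = err * (W.T / (P + x.shape[0]))   # shape (1, D)
    prompt -= lr * grad                     # broadcast update: prompts only, W untouched

print(abs(forward(x, prompt).item() - target))  # small residual after tuning
```

Because gradients flow only into `prompt`, the number of trained parameters here is P x D = 8, regardless of how large the frozen model is; this parameter-efficiency is the general motivation behind prompt tuning.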

Citation (APA)

Li, X., Wang, L., Dong, G., He, K., Zhao, J., Lei, H., … Xu, W. (2023). Generative Zero-Shot Prompt Learning for Cross-Domain Slot Filling with Inverse Prompting. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 825–834). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.findings-acl.52
