Zero- and Few-Shot Event Detection via Prompt-Based Meta Learning

Abstract

With emerging online topics serving as a source of numerous new events, detecting unseen or rare event types remains an elusive challenge for existing event detection methods, which have access to only limited training data. To address this data scarcity problem, we propose MetaEvent, a meta learning-based framework for zero- and few-shot event detection. Specifically, we sample training tasks from existing event types and perform meta training to search for optimal parameters that quickly adapt to unseen tasks. In our framework, we propose a cloze-based prompt and a trigger-aware soft verbalizer to efficiently project model outputs to unseen event types. Moreover, we design a contrastive meta objective based on maximum mean discrepancy (MMD) to learn class-separating features. As such, the proposed MetaEvent can perform zero-shot event detection by mapping features to event types without any prior knowledge. In our experiments, we demonstrate the effectiveness of MetaEvent in both zero-shot and few-shot scenarios, where the proposed method achieves state-of-the-art performance in extensive experiments on the benchmark datasets FewEvent and MAVEN.
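
To make the MMD-based contrastive objective concrete, below is a minimal PyTorch sketch of how a squared-MMD term between the feature distributions of different event types could be computed and turned into a class-separation loss. This is an illustration only, not the paper's exact formulation: the Gaussian kernel, the bandwidth `sigma`, and the pairwise averaging scheme are assumptions made here for clarity.

```python
import torch

def gaussian_kernel(a, b, sigma=1.0):
    # Pairwise RBF kernel between rows of a (n x d) and b (m x d).
    dists = torch.cdist(a, b) ** 2
    return torch.exp(-dists / (2 * sigma ** 2))

def mmd2(x, y, sigma=1.0):
    # Biased squared MMD estimate between two sets of feature vectors.
    k_xx = gaussian_kernel(x, x, sigma).mean()
    k_yy = gaussian_kernel(y, y, sigma).mean()
    k_xy = gaussian_kernel(x, y, sigma).mean()
    return k_xx + k_yy - 2 * k_xy

def contrastive_mmd_loss(features, labels, sigma=1.0):
    # Encourage class-separating features by maximizing the MMD between
    # the feature distributions of every pair of event types in the batch.
    # (Assumed pairing scheme; the paper may weight or sample pairs differently.)
    classes = labels.unique()
    loss, pairs = features.new_zeros(()), 0
    for i in range(len(classes)):
        for j in range(i + 1, len(classes)):
            x = features[labels == classes[i]]
            y = features[labels == classes[j]]
            loss = loss - mmd2(x, y, sigma)  # negative MMD: minimizing pushes classes apart
            pairs += 1
    return loss / max(pairs, 1)
```

In a meta-training loop, such a term would typically be added to the task loss on each sampled support/query set, so that the learned features remain separable when the model is adapted to unseen event types.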

Citation (APA)

Yue, Z., Zeng, H., Lan, M., Ji, H., & Wang, D. (2023). Zero- and Few-Shot Event Detection via Prompt-Based Meta Learning. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 1, pp. 7928–7943). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.acl-long.440
