Improving Event Representation via Simultaneous Weakly Supervised Contrastive Learning and Clustering

Abstract

Representations of events described in text are important for various tasks. In this work, we present SWCC: a Simultaneous Weakly supervised Contrastive learning and Clustering framework for event representation learning. SWCC learns event representations by making better use of co-occurrence information of events. Specifically, we introduce a weakly supervised contrastive learning method that allows us to consider multiple positives and multiple negatives, and a prototype-based clustering method that avoids semantically related events being pulled apart. For model training, SWCC learns representations by simultaneously performing weakly supervised contrastive learning and prototype-based clustering. Experimental results show that SWCC outperforms other baselines on Hard Similarity and Transitive Sentence Similarity tasks. In addition, a thorough analysis of the prototype-based clustering method demonstrates that the learned prototype vectors are able to implicitly capture various relations between events. Our code will be available at https://github.com/gaojun4ever/SWCC4Event.
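To make the two training signals in the abstract concrete, below is a minimal, illustrative sketch (not the authors' implementation) of a contrastive loss over multiple positives and negatives together with a prototype-based clustering loss. The tensor shapes, temperatures, number of prototypes, and the toy co-occurrence mask are assumptions for illustration only.

```python
# Hypothetical sketch of the two losses described in the abstract; not the SWCC code.
import torch
import torch.nn.functional as F


def multi_positive_contrastive_loss(embeddings, pos_mask, temperature=0.1):
    """SupCon-style loss: each event embedding is pulled toward all of its
    positives (e.g. co-occurring events) and pushed away from the rest of the batch.

    embeddings: (N, d) L2-normalized event representations
    pos_mask:   (N, N) boolean, pos_mask[i, j] = True if j is a positive of i
    """
    sim = embeddings @ embeddings.t() / temperature           # (N, N) similarities
    # exclude self-similarity from the softmax denominator
    eye = torch.eye(sim.size(0), dtype=torch.bool, device=sim.device)
    sim.masked_fill_(eye, float("-inf"))
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    pos_mask = pos_mask & ~eye
    # average log-probability over each anchor's positives
    pos_count = pos_mask.sum(dim=1).clamp(min=1)
    loss = -(log_prob.masked_fill(~pos_mask, 0).sum(dim=1) / pos_count)
    return loss.mean()


def prototype_clustering_loss(embeddings, prototypes, targets, temperature=0.1):
    """Cross-entropy over prototype similarities, so that semantically related
    events are assigned to the same prototype instead of being pushed apart.

    prototypes: (K, d) learnable, L2-normalized prototype vectors
    targets:    (N,) prototype assignments (e.g. from an online clustering step)
    """
    logits = embeddings @ prototypes.t() / temperature        # (N, K)
    return F.cross_entropy(logits, targets)


if __name__ == "__main__":
    N, d, K = 8, 128, 16
    z = F.normalize(torch.randn(N, d), dim=1)
    protos = F.normalize(torch.randn(K, d), dim=1)
    pos = torch.rand(N, N) > 0.7                              # toy co-occurrence mask
    assign = torch.randint(0, K, (N,))
    total = multi_positive_contrastive_loss(z, pos) + prototype_clustering_loss(z, protos, assign)
    print(total.item())
```

In the paper's setting these two objectives are optimized simultaneously, so the contrastive term shapes instance-level discrimination while the prototype term preserves cluster-level structure.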

Citation (APA)

Gao, J., Wang, W., Yu, C., Zhao, H., Ng, W., & Xu, R. (2022). Improving Event Representation via Simultaneous Weakly Supervised Contrastive Learning and Clustering. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 1, pp. 3036–3049). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.acl-long.216
