Learning Dynamic Contextualised Word Embeddings via Template-based Temporal Adaptation


Abstract

Dynamic contextualised word embeddings (DCWEs) represent the temporal semantic variations of words. We propose a method for learning DCWEs by time-adapting a pretrained Masked Language Model (MLM) using time-sensitive templates. Given two snapshots C1 and C2 of a corpus, taken respectively at two distinct timestamps T1 and T2, we first propose an unsupervised method to select (a) pivot terms related to both C1 and C2, and (b) anchor terms that are associated with a specific pivot term in each individual snapshot. We then generate prompts by filling manually compiled templates with the extracted pivot and anchor terms. Moreover, we propose an automatic method to learn time-sensitive templates from C1 and C2, without requiring any human supervision. Next, we adapt the pretrained MLM to T2 by fine-tuning it on the generated prompts. Multiple experiments show that our proposed method reduces the perplexity of test sentences in C2, outperforming the current state of the art.
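
To make the pipeline concrete, here is a minimal sketch of the final adaptation step described in the abstract: filling a time-sensitive template with a pivot term and its per-snapshot anchor terms, then fine-tuning a pretrained MLM on the resulting prompts. This is not the authors' released implementation; the template wording, the example terms (pivot "cloud" with anchors "rain" and "storage"), the model choice (bert-base-uncased), and the training hyperparameters are all illustrative assumptions. The paper's actual method extracts pivots and anchors from the two corpus snapshots C1 and C2 without supervision.

```python
# Minimal sketch (assumptions labelled below) of template-based temporal
# adaptation: fill a time-sensitive template with a pivot and its anchors,
# then fine-tune a pretrained MLM on the prompts with the standard
# masked-language-modeling objective (Hugging Face transformers).
import torch
from transformers import (AutoModelForMaskedLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

# Hypothetical template and terms; the paper derives pivots and anchors
# from the snapshots C1 and C2 in an unsupervised manner.
TEMPLATE = ("{pivot} is associated with {anchor_t2} now, whereas it was "
            "associated with {anchor_t1} previously.")
terms = [("cloud", "rain", "storage")]  # (pivot, anchor in C1, anchor in C2)

prompts = [TEMPLATE.format(pivot=p, anchor_t1=a1, anchor_t2=a2)
           for p, a1, a2 in terms]

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")


class PromptDataset(torch.utils.data.Dataset):
    """Wraps the tokenised prompts so the Trainer can iterate over them."""
    def __init__(self, texts):
        self.enc = tokenizer(texts, truncation=True)

    def __len__(self):
        return len(self.enc["input_ids"])

    def __getitem__(self, i):
        return {k: v[i] for k, v in self.enc.items()}


# Standard MLM objective: randomly mask 15% of the tokens in each prompt.
collator = DataCollatorForLanguageModeling(tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="mlm-adapted-t2", num_train_epochs=3),
    train_dataset=PromptDataset(prompts),
    data_collator=collator,
)
trainer.train()  # the adapted MLM should now reflect T2 usage of the pivots
```

After training, the adapted model can be evaluated the way the paper does, by measuring the perplexity it assigns to held-out sentences from C2; lower perplexity indicates better adaptation to the later timestamp.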

Cite

APA

Tang, X., Zhou, Y., & Bollegala, D. (2023). Learning Dynamic Contextualised Word Embeddings via Template-based Temporal Adaptation. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 1, pp. 9352–9369). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.acl-long.520
