Severing the edge between before and after: Neural architectures for temporal ordering of events

Abstract

In this paper, we propose a neural architecture and a set of training methods for ordering events by predicting temporal relations. Our proposed models receive a pair of events within a span of text as input and identify the temporal relation (Before, After, Equal, Vague) between them. Given that a key challenge with this task is the scarcity of annotated data, our models rely on pretrained representations (i.e., RoBERTa, BERT, or ELMo), transfer and multi-task learning (by leveraging complementary datasets), and self-training techniques. Experiments on the MATRES dataset of English documents establish a new state of the art on this task.
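To make the pairwise setup concrete, below is a minimal sketch (not the authors' exact architecture) of a temporal-relation classifier built on a pretrained RoBERTa encoder via the Hugging Face transformers library. The class name, the single-token event indices, and the untrained linear head are illustrative assumptions.

```python
# Sketch: classify the temporal relation between two event mentions by
# encoding the text with pretrained RoBERTa and feeding the two event-token
# vectors to a small classification head over {Before, After, Equal, Vague}.
import torch
import torch.nn as nn
from transformers import RobertaModel, RobertaTokenizerFast

LABELS = ["Before", "After", "Equal", "Vague"]

class PairwiseTemporalClassifier(nn.Module):
    def __init__(self, model_name: str = "roberta-base"):
        super().__init__()
        self.encoder = RobertaModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        # Concatenate the two event vectors and project onto the four labels.
        self.head = nn.Linear(2 * hidden, len(LABELS))

    def forward(self, input_ids, attention_mask, e1_index, e2_index):
        states = self.encoder(input_ids=input_ids,
                              attention_mask=attention_mask).last_hidden_state
        batch = torch.arange(states.size(0), device=states.device)
        e1 = states[batch, e1_index]  # contextual vector of the first event token
        e2 = states[batch, e2_index]  # contextual vector of the second event token
        return self.head(torch.cat([e1, e2], dim=-1))  # (batch, 4) logits

# Usage: score the relation between "left" and "arrived" in one sentence.
# The head is untrained here, so the prediction is only illustrative.
tokenizer = RobertaTokenizerFast.from_pretrained("roberta-base")
model = PairwiseTemporalClassifier()
text = "She left the office before he arrived."
enc = tokenizer(text, return_tensors="pt")
e1_tok = enc.char_to_token(text.index("left"))     # subword index of "left"
e2_tok = enc.char_to_token(text.index("arrived"))  # subword index of "arrived"
logits = model(enc["input_ids"], enc["attention_mask"],
               e1_index=torch.tensor([e1_tok]), e2_index=torch.tensor([e2_tok]))
print(LABELS[logits.argmax(-1).item()])
```

In this sketch each event is represented by the hidden state of a single subword token; a trained system would also need the transfer, multi-task, and self-training procedures the abstract describes to cope with limited annotated data.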

Citation (APA)

Ballesteros, M., Anubhai, R., Wang, S., Pourdamghani, N., Vyas, Y., Ma, J., … Al-Onaizan, Y. (2020). Severing the edge between before and after: Neural architectures for temporal ordering of events. In EMNLP 2020 - 2020 Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference (pp. 5412–5417). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.emnlp-main.436
