Fine-grained Temporal Relation Extraction with Ordered-Neuron LSTM and Graph Convolutional Networks


Abstract

Fine-grained temporal relation extraction (FineTempRel) aims to recognize the durations and timeline of event mentions in text. Current deep learning models for FineTempRel fail to exploit the syntactic structures of the input sentences to enrich the representation vectors. In this work, we propose to fill this gap by introducing novel methods to integrate syntactic structures into deep learning models for FineTempRel. The proposed model focuses on two types of syntactic information from dependency trees, i.e., syntax-based importance scores for learning word representations and syntactic connections for identifying important context words for the event mentions. We also present two novel techniques to facilitate knowledge transfer between the subtasks of FineTempRel, leading to a novel model with state-of-the-art performance for this task.
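To illustrate the second type of syntactic information, the sketch below shows how a graph convolutional layer can propagate word representations along dependency-tree edges so that each word's vector absorbs its syntactic neighbors. This is a minimal, hypothetical illustration of a standard GCN step over a dependency adjacency matrix, not the authors' implementation; the names (`gcn_layer`, `heads`) and the toy dependency structure are assumptions for demonstration.

```python
import numpy as np

def gcn_layer(H, A, W):
    """One GCN propagation step: H' = ReLU(D^-1 (A + I) H W)."""
    A_hat = A + np.eye(A.shape[0])             # add self-loops
    D_inv = np.diag(1.0 / A_hat.sum(axis=1))   # row-normalize by node degree
    return np.maximum(0.0, D_inv @ A_hat @ H @ W)

# Toy sentence of 4 words; heads[i] is the dependency head of word i (-1 = root).
heads = [-1, 0, 0, 2]
n = len(heads)
A = np.zeros((n, n))
for child, head in enumerate(heads):
    if head >= 0:                              # undirected edge child <-> head
        A[child, head] = A[head, child] = 1.0

rng = np.random.default_rng(0)
H = rng.standard_normal((n, 8))                # word vectors (e.g., from an LSTM encoder)
W = rng.standard_normal((8, 8))                # learnable projection
H_new = gcn_layer(H, A, W)
print(H_new.shape)  # (4, 8)
```

Each output vector now mixes a word's own representation with those of its dependency parent and children, which is the mechanism the abstract refers to when identifying important context words for event mentions.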

Citation (APA)

Tran, M. P., Van Nguyen, M., & Nguyen, T. H. (2021). Fine-grained Temporal Relation Extraction with Ordered-Neuron LSTM and Graph Convolutional Networks. In W-NUT 2021 - 7th Workshop on Noisy User-Generated Text, Proceedings of the Conference (pp. 35–45). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.wnut-1.5
