Event temporal relation extraction with attention mechanism and graph neural network


Abstract

Event temporal relation extraction is an important task in natural language processing, and with the development of deep learning many models have been applied to it. However, most existing methods cannot accurately capture the degree of association between different tokens and events, and event-related information is not effectively integrated. In this paper, we propose an event information integration model that integrates event information through a multilayer bidirectional long short-term memory (Bi-LSTM) network and an attention mechanism. Although this scheme improves extraction performance, it can be optimized further. To that end, we also propose a novel relational graph attention network that incorporates edge attributes. In this approach, we first build a semantic dependency graph through dependency parsing, then model the graph while taking edge attributes into account by using a top-k attention mechanism to learn hidden semantic contextual representations, and finally predict event temporal relations. We evaluate the proposed models on the TimeBank-Dense dataset. Compared with previous baselines, the Micro-F1 scores obtained by our models improve by 3.9% and 14.5%, respectively.
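To make the graph component of the abstract concrete, the snippet below is a minimal sketch, not the authors' implementation, of a single attention layer over a dependency graph whose edges carry a dependency-label embedding (the edge attribute) and where each node attends only to its top-k scoring neighbours. All names (TopKEdgeGraphAttention, num_edge_labels, k, and the concatenation-based scoring function) are illustrative assumptions; the paper's exact architecture may differ.

```python
# Hypothetical sketch of a top-k graph attention layer with edge attributes (PyTorch).
import torch
import torch.nn as nn
import torch.nn.functional as F


class TopKEdgeGraphAttention(nn.Module):
    """One attention layer over a dependency graph. Each edge carries a learned
    embedding of its dependency label; only the k strongest neighbours per node
    are kept when computing attention weights."""

    def __init__(self, hidden_dim, num_edge_labels, k=3):
        super().__init__()
        self.k = k
        self.edge_emb = nn.Embedding(num_edge_labels, hidden_dim)
        self.w_node = nn.Linear(hidden_dim, hidden_dim, bias=False)
        # Score combines target node, neighbour node, and edge attribute.
        self.score = nn.Linear(3 * hidden_dim, 1)

    def forward(self, h, adj, edge_labels):
        # h:           (n, d)  node states, e.g. Bi-LSTM token representations
        # adj:         (n, n)  1.0 where a dependency edge exists, else 0.0
        # edge_labels: (n, n)  integer dependency-label id for each edge
        n, d = h.size()
        hw = self.w_node(h)                                   # (n, d)
        e = self.edge_emb(edge_labels)                        # (n, n, d)
        pair = torch.cat(
            [hw.unsqueeze(1).expand(n, n, d),                 # target node i
             hw.unsqueeze(0).expand(n, n, d),                 # neighbour j
             e], dim=-1)                                      # edge attribute
        scores = self.score(pair).squeeze(-1)                 # (n, n)
        scores = scores.masked_fill(adj == 0, float("-inf"))

        # Top-k sparsification: keep only the k strongest neighbours per node.
        k = min(self.k, n)
        topk_val, topk_idx = scores.topk(k, dim=-1)
        mask = torch.full_like(scores, float("-inf"))
        mask.scatter_(-1, topk_idx, topk_val)
        alpha = F.softmax(mask, dim=-1)
        alpha = torch.nan_to_num(alpha)                       # rows with no edges
        return F.relu(alpha @ hw)                             # updated node states
```

In a full model along the lines the abstract describes, several such layers would presumably be stacked on top of Bi-LSTM token states, with the resulting event-pair representations fed to a classifier over the TimeBank-Dense temporal relation labels.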

Cite

Xu, X., Gao, T., Wang, Y., & Xuan, X. (2022). Event temporal relation extraction with attention mechanism and graph neural network. Tsinghua Science and Technology, 27(1), 79–90. https://doi.org/10.26599/TST.2020.9010063
