Discourse Information for Document-Level Temporal Dependency Parsing

Abstract

In this study, we examine the benefits of incorporating discourse information into document-level temporal dependency parsing. Specifically, we evaluate the effectiveness of integrating both high-level discourse profiling information, which describes the discourse function of sentences, and surface-level sentence position information into temporal dependency graph (TDG) parsing. Unexpectedly, our results suggest that simple sentence position information, particularly when encoded using our novel sentence-position embedding method, performs the best, perhaps because it does not rely on noisy model-generated feature inputs. Our proposed system surpasses the current state-of-the-art TDG parsing systems in performance. Furthermore, we aim to broaden the discussion on the relationship between temporal dependency parsing and discourse analysis, given the substantial similarities shared between the two tasks. We argue that discourse analysis results should not be merely regarded as an additional input feature for temporal dependency parsing. Instead, adopting advanced discourse analysis techniques and research insights can lead to more effective and comprehensive approaches to temporal information extraction tasks.
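The abstract does not spell out how the sentence-position embedding is computed, so the following is a minimal sketch of one plausible reading: each sentence's surface position in the document is mapped to a learned vector and appended to the sentence representation before dependency scoring. All names here (SentencePositionEncoder, pos_dim, max_sentences) are illustrative assumptions, not the authors' released code.

```python
# Hypothetical sketch of a sentence-position embedding for TDG parsing.
# Not the paper's implementation; names and dimensions are assumptions.

import torch
import torch.nn as nn


class SentencePositionEncoder(nn.Module):
    def __init__(self, hidden_dim: int, pos_dim: int = 32, max_sentences: int = 128):
        super().__init__()
        # One learned vector per sentence index (0 .. max_sentences - 1).
        self.pos_embedding = nn.Embedding(max_sentences, pos_dim)
        self.max_sentences = max_sentences
        # Project back to hidden_dim so downstream parser layers are unchanged.
        self.proj = nn.Linear(hidden_dim + pos_dim, hidden_dim)

    def forward(self, sent_reprs: torch.Tensor, sent_indices: torch.Tensor) -> torch.Tensor:
        # sent_reprs: (num_sentences, hidden_dim) encoder outputs (e.g., [CLS] vectors)
        # sent_indices: (num_sentences,) 0-based surface position of each sentence
        idx = sent_indices.clamp(max=self.max_sentences - 1)
        pos = self.pos_embedding(idx)                    # (num_sentences, pos_dim)
        combined = torch.cat([sent_reprs, pos], dim=-1)  # append position information
        return self.proj(combined)


if __name__ == "__main__":
    encoder = SentencePositionEncoder(hidden_dim=768)
    reprs = torch.randn(10, 768)   # ten sentences from one document
    indices = torch.arange(10)     # their surface positions
    print(encoder(reprs, indices).shape)  # torch.Size([10, 768])
```

The design choice illustrated here is that position is treated as a categorical feature with its own learned embedding rather than a raw scalar, which is one common way such surface-level information is injected into neural parsers.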

Citation (APA)

Niu, J., Ng, V., Rees, E. E., de Montigny, S., & Penn, G. (2023). Discourse Information for Document-Level Temporal Dependency Parsing. In Proceedings of the 4th Workshop on Computational Approaches to Discourse (CODI 2023) (pp. 82–88). Association for Computational Linguistics. https://doi.org/10.18653/v1/2023.codi-1.10
