BERT-based Cohesion Analysis of Japanese Texts

Citations: 10 · Readers: 69 (Mendeley)

Abstract

The meaning of a natural language text is supported by cohesion among various kinds of entities, including coreference relations, predicate-argument structures, and bridging anaphora relations. However, predicate-argument structures for nominal predicates and bridging anaphora relations have not been studied well, and their analysis remains very difficult. Recent advances in neural networks, in particular pretrained language models such as BERT (Devlin et al., 2019), have significantly improved many natural language processing tasks, making it feasible to study the analysis of cohesion across whole texts. In this study, we tackle an integrated analysis of cohesion in Japanese texts. Our results significantly outperform existing studies on each task, with improvements of roughly 10 to 20 points for both zero anaphora and coreference resolution. Furthermore, we also show that coreference resolution differs in nature from the other tasks and should be treated specially.

Citation (APA)

Ueda, N., Kawahara, D., & Kurohashi, S. (2020). BERT-based Cohesion Analysis of Japanese Texts. In COLING 2020 - 28th International Conference on Computational Linguistics, Proceedings of the Conference (pp. 1323–1333). Association for Computational Linguistics (ACL). https://doi.org/10.5715/jnlp.28.705
