Event causality recognition exploiting multiple annotators' judgments and background knowledge

Abstract

We propose new BERT-based methods for recognizing event causality, such as “smoke cigarettes” → “die of lung cancer”, written in web texts. Our methods capture each annotator's judgment policy by training multiple classifiers, each of which predicts the labels given by a single annotator, and combine the resulting classifiers' outputs to predict the final labels determined by majority vote. Furthermore, we investigate the effect of supplying background knowledge to our classifiers. Since BERT models are pre-trained on a large corpus, some background knowledge about event causality may be learned during pre-training. Our experiments with a Japanese dataset suggest that this is actually the case: performance improved when we pre-trained the BERT models on web texts containing a large number of event causalities rather than on Wikipedia articles or randomly sampled web texts. However, this effect was limited, so we further improved performance by simply appending texts related to an input causality candidate as background knowledge to the input of the BERT models. We believe these findings indicate a promising direction for future research.
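The two ideas in the abstract are easy to sketch in code. Below is a minimal, hypothetical Python illustration (not the authors' released code): one BERT classifier per annotator, whose outputs are combined to approximate the majority-vote label, plus background-knowledge text simply appended to the causality candidate in the BERT input. The checkpoint name, the number of annotators, the probability-averaging combination rule, and the example background sentence are all assumptions made for illustration; the paper's Japanese models pre-trained on causality-rich web texts are not assumed to be publicly available.

import torch
from transformers import AutoTokenizer, BertForSequenceClassification

NUM_ANNOTATORS = 3  # hypothetical; stands in for the paper's annotator pool

# Stand-in checkpoint; the paper uses Japanese BERT models pre-trained on
# web texts containing many event causalities.
CHECKPOINT = "bert-base-multilingual-cased"

tokenizer = AutoTokenizer.from_pretrained(CHECKPOINT)

# One binary classifier per annotator, each fine-tuned (elsewhere) on the
# labels given by that single annotator.
models = [
    BertForSequenceClassification.from_pretrained(CHECKPOINT, num_labels=2).eval()
    for _ in range(NUM_ANNOTATORS)
]

def encode(cause, effect, background=None):
    # Background knowledge, when available, is simply appended to the
    # second segment so BERT sees it as additional context.
    text_b = effect if background is None else effect + " " + background
    return tokenizer(cause, text_b, return_tensors="pt",
                     truncation=True, max_length=512)

@torch.no_grad()
def predict(cause, effect, background=None):
    inputs = encode(cause, effect, background)
    # Average the per-annotator class probabilities; the argmax then
    # approximates the majority vote over the annotators' judgments.
    # (Averaging is one plausible combination rule, assumed here.)
    probs = torch.stack(
        [m(**inputs).logits.softmax(dim=-1) for m in models]
    ).mean(dim=0)
    return probs.argmax(dim=-1).item()  # 1 = causal, 0 = not causal

# Example causality candidate from the abstract, with a hypothetical related
# web sentence as background knowledge. (The output is meaningless until the
# per-annotator classifiers are actually fine-tuned.)
label = predict("smoke cigarettes", "die of lung cancer",
                background="Smoking is a leading cause of lung cancer.")
print(label)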

Citation (APA)

Kadowaki, K., Iida, R., Torisawa, K., Oh, J. H., & Kloetzer, J. (2019). Event causality recognition exploiting multiple annotators’ judgments and background knowledge. In EMNLP-IJCNLP 2019 - 2019 Conference on Empirical Methods in Natural Language Processing and 9th International Joint Conference on Natural Language Processing, Proceedings of the Conference (pp. 5816–5822). Association for Computational Linguistics. https://doi.org/10.18653/v1/d19-1590
