Extracting event temporal relations is an important task for natural language understanding. Most prior work addresses supervised event temporal relation extraction, which typically requires a large amount of human-annotated data for model training. However, annotating data for this task is time-consuming and challenging. To this end, we study the problem of semi-supervised event temporal relation extraction. Self-training, a widely used semi-supervised learning method, can be applied to this problem, but it suffers from noisy pseudo-labels. In this paper, we propose an uncertainty-aware self-training framework (UAST) that quantifies model uncertainty to cope with pseudo-labeling errors. Specifically, UAST comprises (1) an Uncertainty Estimation module that computes model uncertainty when pseudo-labeling unlabeled data; (2) a Sample Selection with Exploration module that selects informative samples based on the uncertainty estimates; and (3) an Uncertainty-Aware Learning module that explicitly incorporates model uncertainty into the self-training process. Experimental results indicate that our approach significantly outperforms previous state-of-the-art methods.
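The abstract does not spell out how each module is implemented, so the following is only a minimal sketch of one plausible instantiation of such a framework: Monte Carlo dropout as the uncertainty estimate, entropy-ranked selection with a small random "exploration" slice, and an exponential down-weighting of uncertain pseudo-labels in the loss. All function names, the dropout-based estimator, and the weighting scheme are illustrative assumptions, not the paper's exact method.

```python
import torch
import torch.nn.functional as F


def estimate_uncertainty(model, inputs, n_passes=10):
    """Approximate predictive uncertainty with Monte Carlo dropout (an
    assumption here): run several stochastic forward passes and take the
    entropy of the averaged class distribution as the uncertainty score."""
    model.train()  # keep dropout layers active during the forward passes
    with torch.no_grad():
        probs = torch.stack(
            [F.softmax(model(inputs), dim=-1) for _ in range(n_passes)]
        )                                   # (n_passes, batch, n_classes)
    mean_probs = probs.mean(dim=0)          # (batch, n_classes)
    entropy = -(mean_probs * mean_probs.clamp_min(1e-12).log()).sum(dim=-1)
    pseudo_labels = mean_probs.argmax(dim=-1)
    return pseudo_labels, entropy


def select_with_exploration(uncertainty, k, explore_frac=0.1):
    """Select mostly the lowest-uncertainty samples, plus a small random
    'exploration' slice so training is not limited to examples the model
    already finds easy."""
    n_explore = int(k * explore_frac)
    sorted_idx = torch.argsort(uncertainty)          # ascending uncertainty
    confident = sorted_idx[: k - n_explore]
    rest = sorted_idx[k - n_explore:]
    explore = rest[torch.randperm(len(rest))[:n_explore]]
    return torch.cat([confident, explore])


def uncertainty_weighted_loss(logits, pseudo_labels, uncertainty):
    """Down-weight each pseudo-labeled example in proportion to how
    uncertain the teacher model was about it."""
    per_example = F.cross_entropy(logits, pseudo_labels, reduction="none")
    weights = torch.exp(-uncertainty)       # confident examples weigh more
    return (weights * per_example).mean()
```

In a self-training loop, these pieces would be chained each round: pseudo-label the unlabeled pool with `estimate_uncertainty`, keep the subset returned by `select_with_exploration`, and retrain the student on labeled data plus the selected pseudo-labeled data under `uncertainty_weighted_loss`.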
Citation
Cao, P., Zuo, X., Chen, Y., Liu, K., Zhao, J., & Bi, W. (2021). Uncertainty-Aware Self-Training for Semi-Supervised Event Temporal Relation Extraction. In Proceedings of the ACM International Conference on Information and Knowledge Management (CIKM '21) (pp. 2900–2904). Association for Computing Machinery. https://doi.org/10.1145/3459637.3482207