Fine-grained entity typing (FET) is a fundamental task for various entity-leveraging applications. Despite substantial progress, existing systems still struggle with noisy training samples introduced by distant supervision. To address this noise, previous studies either process clean samples (i.e., those with only one label) and noisy samples (i.e., those with multiple labels) with different strategies, or filter noisy labels under the assumption that the distantly-supervised label set necessarily contains the correct type label. In this paper, we propose a probabilistic automatic relabeling method that treats all training samples uniformly. Our method estimates the pseudo-truth label distribution of each sample, and this distribution is treated as part of the trainable parameters, jointly updated during training. The proposed approach does not rely on any prerequisite or extra supervision, making it effective in real applications. Experiments on several benchmarks show that our method outperforms previous competitive approaches and indeed alleviates the noisy labeling problem.
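The core idea of the abstract, per-sample pseudo-truth label distributions optimized jointly with the classifier, can be illustrated with a minimal sketch. This is a hypothetical toy implementation, not the paper's exact model: it uses a linear classifier, a softmax parameterization of the pseudo-truth distribution `q`, and a small KL regularizer toward the distant (noisy) label prior, all of which are illustrative assumptions.

```python
import numpy as np

def softmax(z):
    # numerically stable softmax over the last axis
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def relabel_train(X, noisy_labels, steps=300, lr=0.5, tau=0.1, seed=0):
    """Jointly train a linear type classifier W and per-sample pseudo-truth
    distributions q (a toy sketch of automatic relabeling, not the paper's model).

    X:            (n, d) feature matrix
    noisy_labels: (n, k) binary matrix of distantly-supervised type labels
    Returns (W, q) where q rows are the learned pseudo-truth distributions.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    k = noisy_labels.shape[1]
    W = rng.normal(scale=0.1, size=(d, k))
    # initialize pseudo-truth logits from the (normalized) noisy label set
    prior = noisy_labels / noisy_labels.sum(axis=1, keepdims=True)
    s = np.log(prior + 1e-6)
    for _ in range(steps):
        p = softmax(X @ W)   # model prediction
        q = softmax(s)       # current pseudo-truth distribution
        # loss = H(q, p) + tau * KL(q || prior); both W and s are updated
        W -= lr * X.T @ (p - q) / n                     # grad of H(q, p) wrt logits
        g = -np.log(p + 1e-12) + tau * (np.log(q + 1e-12)
                                        - np.log(prior + 1e-12) + 1.0)
        # chain rule through the softmax parameterization of q
        s -= lr * q * (g - (q * g).sum(axis=1, keepdims=True))
    return W, softmax(s)
```

Because `q` is produced by a softmax over trainable logits, every pseudo-truth vector remains a valid probability distribution throughout training, and gradient steps can shift mass away from labels the classifier consistently contradicts.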
Zhang, H., Long, D., Xu, G., Zhu, M., Xie, P., Huang, F., & Wang, J. (2020). Learning with noise: Improving distantly-supervised fine-grained entity typing via automatic relabeling. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 2021-January, pp. 3808–3815). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2020/527