Abstract
Knowledge Distillation (KD) is extensively used to compress large pre-trained language models for deployment on edge devices in real-world applications. However, one neglected area of research is the impact of noisy (corrupted) labels on KD. We present, to the best of our knowledge, the first study on KD with noisy labels in Natural Language Understanding (NLU). We document the scope of the problem and present two methods to mitigate the impact of label noise. Experiments on the GLUE benchmark show that our methods are effective even under high noise levels. Nevertheless, our results indicate that more research is necessary to cope with label noise in the KD setting.
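The abstract does not spell out the KD objective or the two noise-mitigation methods themselves. As background only, the sketch below shows the standard distillation loss (Hinton-style temperature-scaled soft targets combined with hard-label cross-entropy), assuming a PyTorch setup; the function name kd_loss and the hyperparameters T and alpha are illustrative, not the authors' formulation.

    import torch.nn.functional as F

    def kd_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
        # Soft-target term: KL divergence between the temperature-scaled
        # student and teacher distributions, rescaled by T^2 so gradient
        # magnitudes stay comparable across temperatures.
        soft = F.kl_div(
            F.log_softmax(student_logits / T, dim=-1),
            F.softmax(teacher_logits / T, dim=-1),
            reduction="batchmean",
        ) * (T * T)
        # Hard-label term: cross-entropy on the (possibly noisy) labels,
        # the component that label corruption directly affects.
        hard = F.cross_entropy(student_logits, labels)
        return alpha * soft + (1 - alpha) * hard

In this composite loss, label noise enters through the hard term, which is why the teacher's soft targets are often viewed as a natural buffer against corruption; whether the paper's two methods work this way is not stated in the abstract.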
Citation
Bhardwaj, S., Ghaddar, A., Rashid, A., Bibi, K., Li, C., Ghodsi, A., … Rezagholizadeh, M. (2021). Knowledge Distillation with Noisy Labels for Natural Language Understanding. In W-NUT 2021 - 7th Workshop on Noisy User-Generated Text, Proceedings of the Conference (pp. 297–303). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.wnut-1.33