Tailoring Instructions to Student's Learning Levels Boosts Knowledge Distillation

Abstract

It has been commonly observed that a teacher model with superior performance does not necessarily yield a stronger student, which points to a mismatch between how teachers are currently trained and what makes knowledge transfer effective. To better guide the teacher training process, we introduce the concept of distillation influence, which quantifies how distilling from each training sample affects the student's generalization ability. We propose Learning Good Teacher Matters (LGTM), an efficient training technique that incorporates distillation influence into the teacher's learning process. By prioritizing samples that are likely to improve the student's generalization, LGTM outperforms 10 common knowledge distillation baselines on 6 text classification tasks in the GLUE benchmark.
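The abstract's central idea, weighting each sample's distillation signal by its estimated effect on the student's generalization, can be illustrated in a few lines. The PyTorch sketch below shows a per-sample weighted knowledge distillation loss under that reading; the function name `weighted_distillation_loss` and the externally supplied `sample_weights` tensor are hypothetical stand-ins for the paper's distillation-influence estimator, not LGTM's actual algorithm.

```python
import torch
import torch.nn.functional as F

def weighted_distillation_loss(student_logits, teacher_logits,
                               sample_weights, temperature=2.0):
    """Per-sample weighted KD loss (minimal sketch, not the paper's method).

    Each sample's KL term is scaled by a weight meant to reflect its
    estimated influence on the student's generalization; here the
    weights are simply supplied by the caller.
    """
    # Soften both distributions with the temperature.
    log_p_student = F.log_softmax(student_logits / temperature, dim=-1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    # Per-sample KL divergence between teacher and student predictions.
    kl_per_sample = F.kl_div(log_p_student, p_teacher,
                             reduction="none").sum(dim=-1)
    # Scale each sample's loss by its (hypothetical) influence weight;
    # the T^2 factor is the standard Hinton-style gradient rescaling.
    return (sample_weights * kl_per_sample).mean() * temperature ** 2

# Example usage with random tensors (batch of 4, 3 classes):
student_logits = torch.randn(4, 3)
teacher_logits = torch.randn(4, 3)
influence = torch.ones(4)  # uniform weights recover plain KD
loss = weighted_distillation_loss(student_logits, teacher_logits, influence)
```

With uniform weights this reduces to ordinary soft-label distillation; the interesting case is when the weights up-weight samples estimated to help the student generalize, which is the role distillation influence plays in the paper.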

Citation (APA)

Ren, Y., Zhong, Z., Shi, X., Zhu, Y., Yuan, C., & Li, M. (2023). Tailoring Instructions to Student’s Learning Levels Boosts Knowledge Distillation. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 1, pp. 1990–2006). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.acl-long.111
