Model compression with two-stage multi-teacher knowledge distillation for web question answering system

86 citations · 121 Mendeley readers
Abstract

Deep pre-training and fine-tuning models (such as BERT and OpenAI GPT) have demonstrated excellent results in question answering. However, due to the sheer number of model parameters, the inference speed of these models is very slow. Applying these complex models to real business scenarios is therefore a challenging but practical problem. Previous model compression methods usually suffer from information loss during the compression procedure, leading to inferior models compared with the original one. To tackle this challenge, we propose a Two-stage Multi-teacher Knowledge Distillation (TMKD for short) method for web Question Answering systems. We first develop a general Q&A distillation task for student model pre-training, and further fine-tune this pre-trained student model with multi-teacher knowledge distillation on downstream tasks (such as the Web Q&A task and the MNLI, SNLI, and RTE tasks from GLUE), which effectively reduces the overfitting bias of individual teacher models and transfers more general knowledge to the student model. The experimental results show that our method significantly outperforms the baseline methods and even achieves results comparable to the original teacher models, along with a substantial speedup of model inference.
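To make the multi-teacher idea concrete, the sketch below shows one common way to distill from several teachers: average their temperature-softened output distributions into a single soft target and train the student against it with cross-entropy. This is a minimal illustration of the general technique, not the paper's exact TMKD objective; the function names and the temperature value are illustrative assumptions.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-softened softmax over a 1-D logit vector."""
    z = np.asarray(logits, dtype=float) / temperature
    z -= z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def multi_teacher_soft_target(teacher_logits, temperature=2.0):
    """Average the softened distributions of several teachers.

    Averaging smooths out any single teacher's overfitting bias,
    which is the intuition behind multi-teacher distillation.
    """
    probs = [softmax(t, temperature) for t in teacher_logits]
    return np.mean(probs, axis=0)

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Cross-entropy between the averaged teacher target and the
    student's softened distribution (soft-label distillation loss)."""
    target = multi_teacher_soft_target(teacher_logits, temperature)
    student = softmax(student_logits, temperature)
    return float(-np.sum(target * np.log(student + 1e-12)))
```

In practice this soft-label term is typically combined with the ordinary hard-label cross-entropy on the ground truth, weighted by a mixing hyperparameter.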

Citation (APA)

Yang, Z., Shou, L., Gong, M., Lin, W., & Jiang, D. (2020). Model compression with two-stage multi-teacher knowledge distillation for web question answering system. In WSDM 2020 - Proceedings of the 13th International Conference on Web Search and Data Mining (pp. 690–698). Association for Computing Machinery, Inc. https://doi.org/10.1145/3336191.3371792
