CKDST: Comprehensively and Effectively Distill Knowledge from Machine Translation to End-to-End Speech Translation


Abstract

Distilling knowledge from a high-resource task, e.g., machine translation, is an effective way to alleviate the data scarcity problem of end-to-end speech translation. However, previous works simply use classical knowledge distillation, which does not allow for adequate transfer of knowledge from machine translation. In this paper, we propose a comprehensive knowledge distillation framework for speech translation, CKDST, which is capable of comprehensively and effectively distilling knowledge from machine translation to speech translation from two perspectives: cross-modal contrastive representation distillation and simultaneous decoupled knowledge distillation. In the former, we leverage a contrastive learning objective to optimize the mutual information between speech and text representations for representation distillation in the encoder. In the latter, we decouple the non-target class knowledge from target class knowledge for logits distillation in the decoder. Experiments on the MuST-C benchmark dataset demonstrate that our CKDST substantially improves the baseline by 1.2 BLEU on average in all translation directions, and outperforms previous state-of-the-art end-to-end and cascaded speech translation models. The source code is available at https://github.com/ethanyklei/CKDST.
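The two objectives named in the abstract can be sketched in PyTorch. This is an illustrative reconstruction, not the paper's released code: it assumes a standard InfoNCE-style contrastive loss over pooled speech/text encoder outputs, and the decoupled knowledge distillation formulation (separate target-class and non-target-class KL terms). All function names and hyperparameter values are hypothetical.

```python
import torch
import torch.nn.functional as F

def contrastive_repr_loss(speech_repr, text_repr, temperature=0.1):
    """InfoNCE-style loss: paired speech/text representations (same row)
    are positives; all other rows in the batch are negatives."""
    s = F.normalize(speech_repr, dim=-1)
    t = F.normalize(text_repr, dim=-1)
    logits = s @ t.T / temperature                       # (batch, batch) similarities
    labels = torch.arange(s.size(0), device=logits.device)  # positives on the diagonal
    return F.cross_entropy(logits, labels)

def decoupled_kd_loss(student_logits, teacher_logits, target,
                      alpha=1.0, beta=8.0, T=4.0):
    """Decoupled KD: split the classical KD loss into a target-class term
    (TCKD, binary target-vs-rest distribution) and a non-target-class term
    (NCKD, renormalized over non-target classes), weighted independently."""
    mask = F.one_hot(target, student_logits.size(-1)).bool()
    p_s = F.softmax(student_logits / T, dim=-1)
    p_t = F.softmax(teacher_logits / T, dim=-1)

    # TCKD: KL between binary [p(target), 1 - p(target)] distributions.
    pt_s, pt_t = p_s[mask], p_t[mask]                    # (batch,) target-class probs
    b_s = torch.stack([pt_s, 1.0 - pt_s], dim=1)
    b_t = torch.stack([pt_t, 1.0 - pt_t], dim=1)
    tckd = F.kl_div(b_s.log(), b_t, reduction="batchmean") * T ** 2

    # NCKD: KL over non-target classes, renormalized by masking out the target.
    log_ns = F.log_softmax(student_logits / T - 1000.0 * mask, dim=-1)
    ns_t = F.softmax(teacher_logits / T - 1000.0 * mask, dim=-1)
    nckd = F.kl_div(log_ns, ns_t, reduction="batchmean") * T ** 2

    return alpha * tckd + beta * nckd
```

With `beta > alpha`, the non-target-class knowledge is emphasized rather than suppressed by a confident teacher, which is the motivation for decoupling the two terms.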

Citation (APA)

Lei, Y., Xue, Z., Sun, H., Zhao, X., Zhu, S., Lin, X., & Xiong, D. (2023). CKDST: Comprehensively and effectively distill knowledge from machine translation to end-to-end speech translation. In Findings of the Association for Computational Linguistics: ACL 2023 (pp. 3123–3137). Association for Computational Linguistics. https://doi.org/10.18653/v1/2023.findings-acl.195
