AMLN: Adversarial-Based Mutual Learning Network for Online Knowledge Distillation


Abstract

Online knowledge distillation, in which teacher and student models (or an ensemble of student models) are trained jointly and collaboratively, has attracted increasing interest in recent years. However, existing works focus mainly on outcome-driven learning from knowledge such as classification probabilities, while the distillation process itself, which captures rich and useful intermediate features and information, is largely neglected. In this work, we propose an innovative adversarial-based mutual learning network (AMLN) that introduces process-driven learning beyond outcome-driven learning for augmented online knowledge distillation. A block-wise training module is designed that adversarially guides the information flow and mutual learning among peer networks throughout the different learning stages, up to the final network layer where higher-level information is captured. AMLN has been evaluated with a variety of network architectures on three widely used benchmark datasets. Extensive experiments show that AMLN consistently outperforms state-of-the-art knowledge transfer methods.
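
The abstract combines an outcome-driven term (mutual learning on class probabilities) with a process-driven term (adversarial alignment of block-wise intermediate features). The sketch below illustrates, under stated assumptions, what such a two-peer objective could look like in PyTorch: a symmetric KL loss between softened peer predictions plus a per-block discriminator that each peer tries to fool. This is a minimal illustration, not the authors' implementation; all module names, the temperature T, and the label convention of the discriminator are assumptions made here for clarity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hedged sketch only (not the authors' released code).
# Outcome-driven term: cross-entropy + symmetric KL between softened peer logits.
# Process-driven term: a small discriminator tries to tell which peer produced a
# block's intermediate feature map, while each peer tries to fool it.

class FeatureDiscriminator(nn.Module):
    """Predicts which peer produced a pooled intermediate feature map."""
    def __init__(self, feat_dim, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feat_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 1))                 # logit: 1 = peer A, 0 = peer B

    def forward(self, feat):                      # feat: N x C x H x W
        return self.net(feat.mean(dim=(2, 3)))    # global average pooling


def outcome_loss(logits_a, logits_b, labels, T=3.0):
    """Cross-entropy on ground truth + symmetric KL between softened peers."""
    ce = F.cross_entropy(logits_a, labels) + F.cross_entropy(logits_b, labels)
    kl_ab = F.kl_div(F.log_softmax(logits_a / T, dim=1),
                     F.softmax(logits_b.detach() / T, dim=1),
                     reduction="batchmean") * T * T
    kl_ba = F.kl_div(F.log_softmax(logits_b / T, dim=1),
                     F.softmax(logits_a.detach() / T, dim=1),
                     reduction="batchmean") * T * T
    return ce + kl_ab + kl_ba


def process_loss(disc, feat_a, feat_b):
    """Adversarial term for the peers: make block features indistinguishable."""
    pred_a, pred_b = disc(feat_a), disc(feat_b)
    fool_a = F.binary_cross_entropy_with_logits(   # peer A mimics peer B
        pred_a, torch.zeros_like(pred_a))
    fool_b = F.binary_cross_entropy_with_logits(   # peer B mimics peer A
        pred_b, torch.ones_like(pred_b))
    return fool_a + fool_b


def discriminator_loss(disc, feat_a, feat_b):
    """Discriminator update: classify detached features by their true source."""
    pred_a, pred_b = disc(feat_a.detach()), disc(feat_b.detach())
    return (F.binary_cross_entropy_with_logits(pred_a, torch.ones_like(pred_a)) +
            F.binary_cross_entropy_with_logits(pred_b, torch.zeros_like(pred_b)))
```

In a training loop, each peer's total objective would combine outcome_loss with a weighted sum of process_loss over the block-wise features, while one discriminator per block is updated with discriminator_loss; the actual block granularity and loss weighting follow the paper's full text rather than this sketch.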

Citation (APA)

Zhang, X., Lu, S., Gong, H., Luo, Z., & Liu, M. (2020). AMLN: Adversarial-Based Mutual Learning Network for Online Knowledge Distillation. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 12357 LNCS, pp. 158–173). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-58610-2_10
