Distilling Inter-Class Distance for Semantic Segmentation


Abstract

Knowledge distillation is widely adopted in semantic segmentation to reduce the computation cost. Previous knowledge distillation methods for semantic segmentation focus on pixel-wise feature alignment and intra-class feature variation distillation, neglecting to transfer the knowledge of the inter-class distance in the feature space, which is important for semantic segmentation. To address this issue, we propose an Inter-class Distance Distillation (IDD) method to transfer the inter-class distance in the feature space from the teacher network to the student network. Furthermore, since semantic segmentation is a position-dependent task, we exploit a position information distillation module to help the student network encode more position information. Extensive experiments on three popular datasets, Cityscapes, Pascal VOC, and ADE20K, show that our method improves the accuracy of semantic segmentation models and achieves state-of-the-art performance. For example, it boosts the baseline model ("PSPNet + ResNet-18") by 7.50% in accuracy on the Cityscapes dataset.
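
The abstract does not spell out the exact formulation, but the core idea, matching the pairwise distances between class representations of the student and the teacher, can be sketched as follows. This is a minimal illustrative PyTorch sketch under stated assumptions (prototypes via masked average pooling over ground-truth regions, Euclidean distances, an L2 penalty); it is not the authors' implementation, and all names below are hypothetical.

```python
# Illustrative sketch of an inter-class distance distillation loss.
# Assumptions (not from the paper): class prototypes are built by masked
# average pooling with ground-truth labels, inter-class distance is the
# Euclidean distance between prototypes, and the distillation penalty is
# the MSE between the student's and teacher's distance matrices.
import torch
import torch.nn.functional as F


def class_prototypes(features, labels, num_classes):
    """Average-pool features per ground-truth class (masked average pooling).

    features: (B, C, H, W) feature map.
    labels:   (B, H, W) ground-truth segmentation map with class ids in
              [0, num_classes); ignored pixels (e.g. 255) fall outside that
              range and are simply skipped.
    Returns a (num_classes, C) tensor; rows for absent classes stay zero.
    """
    b, c, h, w = features.shape
    # Downsample labels to the feature resolution.
    labels = F.interpolate(labels[:, None].float(), size=(h, w), mode="nearest")
    labels = labels.long().view(-1)                       # (B*H*W,)
    feats = features.permute(0, 2, 3, 1).reshape(-1, c)   # (B*H*W, C)
    protos = features.new_zeros(num_classes, c)
    for k in range(num_classes):
        mask = labels == k
        if mask.any():
            protos[k] = feats[mask].mean(dim=0)
    return protos


def inter_class_distance_loss(student_feats, teacher_feats, labels, num_classes):
    """Match the pairwise inter-class distance matrices of student and teacher.

    Assumes the student and teacher feature channels already agree (in
    practice a 1x1 conv projection on the student side would align them).
    """
    ps = class_prototypes(student_feats, labels, num_classes)
    pt = class_prototypes(teacher_feats.detach(), labels, num_classes)
    ds = torch.cdist(ps, ps)   # (K, K) student inter-class distances
    dt = torch.cdist(pt, pt)   # (K, K) teacher inter-class distances
    return F.mse_loss(ds, dt)
```

In a training loop, a loss like this would typically be added to the standard cross-entropy segmentation loss with a weighting coefficient; the position information distillation module mentioned above would contribute a further term, whose form the abstract does not specify.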

Cite

APA

Zhang, Z., Zhou, C., & Tu, Z. (2022). Distilling Inter-Class Distance for Semantic Segmentation. In IJCAI International Joint Conference on Artificial Intelligence (pp. 1686–1692). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2022/235
