Teaching Where to See: Knowledge Distillation-Based Attentive Information Transfer in Vehicle Maker Classification

Abstract

Deep neural networks (DNNs) have been applied to various fields and achieved high performance. However, they require significant computing resources because of their numerous parameters, even though some of those parameters are redundant and do not contribute to the DNN's performance. Recently, to address this problem, many knowledge distillation-based methods have been proposed to compress a large DNN model into a small one. In this paper, we propose a novel knowledge distillation method that compresses a vehicle maker classification system based on a cascaded convolutional neural network (CNN) into a single CNN. The system uses Mask R-CNN (mask regions with CNN features) as a preprocessor for vehicle region detection, used in conjunction with a CNN classifier. Through this preprocessor, the classifier receives a background-removed vehicle image, which allows it to attend more strongly to the vehicle region. With this cascaded structure, the system classifies vehicle makers with about 91% accuracy. Most notably, when we compress the system into a single CNN through the proposed knowledge distillation method, it achieves about 89% accuracy, a loss of only about 2 percentage points. Our experimental results show that the proposed method is superior to the conventional knowledge distillation method in terms of performance transfer.
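The abstract compares the proposed attentive transfer against the conventional knowledge distillation baseline (soft-target distillation in the style of Hinton et al.). The details of the proposed method are not given here, but the conventional baseline can be sketched as follows — a minimal NumPy illustration, where the loss names, temperature `T`, and mixing weight `alpha` are standard KD conventions rather than values from this paper:

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; a higher T produces softer probabilities,
    # exposing the teacher's "dark knowledge" about similar classes.
    z = np.asarray(logits, dtype=float) / T
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Conventional knowledge distillation loss: a weighted sum of
    (1) cross-entropy between the student and the hard labels, and
    (2) KL divergence between temperature-softened teacher and student
        outputs, scaled by T^2 to keep gradient magnitudes comparable."""
    labels = np.asarray(labels)
    # Hard-label cross-entropy (temperature 1)
    p_student = softmax(student_logits, T=1.0)
    ce = -np.log(p_student[np.arange(len(labels)), labels] + 1e-12).mean()
    # Soft-target KL divergence at temperature T
    soft_t = softmax(teacher_logits, T)
    soft_s = softmax(student_logits, T)
    kl = (soft_t * (np.log(soft_t + 1e-12)
                    - np.log(soft_s + 1e-12))).sum(axis=-1).mean()
    return alpha * ce + (1 - alpha) * (T ** 2) * kl
```

In this setup the cascaded Mask R-CNN + classifier system plays the teacher and the single CNN plays the student; the paper's contribution lies in what additional attentive information is transferred beyond these soft targets.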

Citation (APA)

Lee, Y., Ahn, N., Heo, J. H., Jo, S. Y., & Kang, S. J. (2019). Teaching Where to See: Knowledge Distillation-Based Attentive Information Transfer in Vehicle Maker Classification. IEEE Access, 7, 86412–86420. https://doi.org/10.1109/ACCESS.2019.2925198
