Towards Efficient Convolutional Neural Networks Through Low-Error Filter Saliency Estimation

Abstract

Filter-saliency-based channel pruning is a state-of-the-art method for compressing and accelerating deep convolutional neural networks. It ranks the importance of each filter by estimating the impact of that filter's removal on the training loss, then removes the least important filters and fine-tunes the remaining network. In this work, we propose a systematic channel pruning method that significantly reduces the estimation error of filter saliency. Unlike existing approaches, our method largely reduces the magnitude of the network's parameters by introducing the alternating direction method of multipliers (ADMM) into the pre-training procedure, which in turn significantly improves the Taylor-expansion-based estimate of filter saliency. Extensive experiments with various benchmark network architectures and datasets demonstrate that the proposed method selects unimportant filters far more reliably and outperforms state-of-the-art channel pruning methods.
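
To make the saliency criterion concrete, the sketch below shows a first-order Taylor estimate of the loss change caused by zeroing each convolutional filter, which is the kind of estimate the abstract refers to. This is a minimal illustration in PyTorch, not the authors' implementation; the function name filter_saliency and its interface are assumptions, and the ADMM pre-training step that the paper adds to shrink parameter magnitudes (and thus the higher-order error terms) is not shown here.

    # Minimal sketch (assumed names, not the authors' code): rank conv filters
    # by the magnitude of a first-order Taylor estimate of the loss change
    # incurred when a filter's weights are set to zero.
    import torch
    import torch.nn as nn

    def filter_saliency(model, loss_fn, data, target):
        model.zero_grad()
        loss = loss_fn(model(data), target)
        loss.backward()

        saliencies = []  # (layer_name, filter_index, saliency_score)
        for name, module in model.named_modules():
            if isinstance(module, nn.Conv2d):
                w = module.weight        # shape: (out_ch, in_ch, kH, kW)
                g = module.weight.grad
                # Removing filter k zeroes w[k]; to first order the loss
                # changes by -sum(g[k] * w[k]), so its magnitude serves as
                # the saliency score for that filter.
                scores = (g * w).sum(dim=(1, 2, 3)).abs()
                for k, s in enumerate(scores.tolist()):
                    saliencies.append((name, k, s))
        # Least important filters first; these are the pruning candidates.
        return sorted(saliencies, key=lambda t: t[2])

In a pruning loop one would remove the filters at the head of this ranking and then fine-tune the remaining network, as described in the abstract.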

Cite

APA

Wang, Z., Li, C., Wang, X., & Wang, D. (2019). Towards Efficient Convolutional Neural Networks Through Low-Error Filter Saliency Estimation. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11671 LNAI, pp. 255–267). Springer Verlag. https://doi.org/10.1007/978-3-030-29911-8_20
