Convolutional Neural Networks Do Work with Pre-Defined Filters

Abstract

We present a novel class of Convolutional Neural Networks called Pre-defined Filter Convolutional Neural Networks (PFCNNs), in which all n × n convolution kernels with n > 1 are pre-defined and held constant during training. PFCNNs rely on a special form of depthwise convolution called a Pre-defined Filter Module (PFM). In the channel-wise convolution part, the 1 × n × n kernels are drawn from a fixed pool of only a few (16) different pre-defined kernels. In the 1 × 1 convolution part, linear combinations of the pre-defined filter outputs are learned. Despite this harsh restriction, complex and discriminative features are learned. These findings provide a novel perspective on how information is processed within deep CNNs. We discuss various properties of PFCNNs and demonstrate their effectiveness on the popular datasets Caltech101, CIFAR10, CUB-200-2011, FGVC-Aircraft, Flowers102, and Stanford Cars. Our implementation of PFCNNs is available on GitHub at https://github.com/Criscraft/PredefinedFilterNetworks.
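The PFM described above can be illustrated with a minimal NumPy sketch: a depthwise convolution whose per-channel 3 × 3 kernels are drawn from a small fixed pool and never trained, followed by a learnable 1 × 1 convolution that mixes the filter responses. The specific filter pool, padding, and shapes below are illustrative assumptions, not the paper's exact configuration (see the linked repository for the authors' implementation).

```python
import numpy as np

# Hypothetical pool of fixed 3x3 kernels (the paper uses a pool of 16;
# four classic filters are shown here purely for illustration).
FILTER_POOL = np.array([
    [[0, 0, 0], [0, 1, 0], [0, 0, 0]],     # identity
    [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]],  # Sobel x
    [[-1, -2, -1], [0, 0, 0], [1, 2, 1]],  # Sobel y
    [[0, 1, 0], [1, -4, 1], [0, 1, 0]],    # Laplacian
], dtype=float)

def depthwise_conv(x, kernel_ids):
    """Apply one fixed 3x3 kernel per input channel ('valid' padding).
    x: (C, H, W); kernel_ids: length-C indices into FILTER_POOL."""
    C, H, W = x.shape
    out = np.zeros((C, H - 2, W - 2))
    for c, k in enumerate(kernel_ids):
        ker = FILTER_POOL[k]
        for i in range(H - 2):
            for j in range(W - 2):
                out[c, i, j] = np.sum(x[c, i:i + 3, j:j + 3] * ker)
    return out

def pfm(x, kernel_ids, w):
    """Pre-defined Filter Module sketch: frozen depthwise convolution,
    then a learnable 1x1 convolution. Only w (C_out, C_in) is trainable."""
    d = depthwise_conv(x, kernel_ids)      # (C_in, H-2, W-2), fixed filters
    return np.einsum('oc,chw->ohw', w, d)  # learned channel mixing
```

During training, gradients would flow only into `w` (and any learnable 1 × 1 layers elsewhere in the network), while `FILTER_POOL` stays constant, matching the restriction the abstract describes.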

Citation (APA)

Linse, C., Barth, E., & Martinetz, T. (2023). Convolutional Neural Networks Do Work with Pre-Defined Filters. In Proceedings of the International Joint Conference on Neural Networks (Vol. 2023-June). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/IJCNN54540.2023.10191449
