Stochastic drop of kernel windows for improved generalization in convolution neural networks

Abstract

We propose a novel dropout technique for convolutional neural networks by redesigning the Dropout and DropConnect methods. Conventional drop methods operate on individual weight values of fully connected networks, so when they are applied to convolution layers, only some of the kernel weights are removed. However, the weights of a convolutional kernel window jointly constitute a specific pattern, so dropping only part of a window may alter the learned pattern and end up modeling a completely different local pattern. We therefore make the whole kernel window the basic unit of dropping for convolutional weights, so that a single output-map value is dropped. We evaluated the proposed DropKernel strategy on CIFAR-10 object classification in comparison with the conventional Dropout and DropConnect methods, and showed improved performance of the proposed method.
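The full formulation is not spelled out in the abstract, so the PyTorch sketch below illustrates only one plausible reading of DropKernel: since dropping the entire kernel window behind a given output value zeroes exactly that output-map value, the Bernoulli mask is sampled per output element rather than per individual weight (DropConnect) or per whole feature map (spatial dropout). The class name DropKernelConv2d and the drop_prob parameter are illustrative assumptions, not names taken from the paper.

import torch
import torch.nn as nn

class DropKernelConv2d(nn.Module):
    """Conv layer whose drop unit is a whole kernel window (sketch).

    Assumption: dropping the entire k x k window that produces one
    output value is equivalent to zeroing that single output-map value,
    so the mask is drawn per output element.
    """
    def __init__(self, in_ch, out_ch, kernel_size, drop_prob=0.3, **kw):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size, **kw)
        self.drop_prob = drop_prob

    def forward(self, x):
        y = self.conv(x)
        if self.training and self.drop_prob > 0:
            keep = 1.0 - self.drop_prob
            # One Bernoulli draw per output-map value: dropping it is
            # the same as dropping the whole kernel window behind it.
            mask = torch.bernoulli(torch.full_like(y, keep))
            y = y * mask / keep  # inverted-dropout rescaling
        return y

Usage, under the same assumptions: layer = DropKernelConv2d(3, 32, 3, drop_prob=0.3, padding=1); at evaluation time the mask is skipped, as in standard Dropout.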

Citation (APA)

Lee, S., & Jang, G. J. (2019). Stochastic drop of kernel windows for improved generalization in convolution neural networks. In Advances in Intelligent Systems and Computing (Vol. 903, pp. 223–227). Springer Verlag. https://doi.org/10.1007/978-3-030-11051-2_34
