Max-plus operators applied to filter selection and model pruning in neural networks


Abstract

Following recent advances in morphological neural networks, we study in more depth how Max-plus operators can be exploited to define morphological units and how these units behave when incorporated into layers of conventional neural networks. Besides showing that they can be easily implemented with modern machine learning frameworks, we confirm and extend the observation that a Max-plus layer can be used to select important filters and reduce redundancy in the preceding layer, without incurring a performance loss. Experimental results demonstrate that the filter selection strategy enabled by a Max-plus layer is highly efficient and robust, and we use it to successfully prune two neural network architectures. We also point out a close connection between Maxout networks and our pruned Max-plus networks by comparing their respective characteristics. The code for reproducing our experiments is available online at https://github.com/yunxiangzhang.
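To make the idea concrete, below is a minimal sketch of a Max-plus unit and the filter-selection observation the abstract describes. It is an illustration, not the authors' implementation: the function names are hypothetical, and the pruning rule shown (drop filters that never attain the maximum) is the intuition stated in the abstract, reduced to a toy form.

```python
def maxplus_unit(x, w):
    """Max-plus (tropical) layer: out[j] = max_i (x[i] + w[i][j]).

    x: list of n input activations; w: n x m weight matrix (list of rows).
    The sum-of-products of a linear unit is replaced by a max-of-sums,
    i.e. a morphological dilation in the (max, +) semiring.
    """
    n, m = len(x), len(w[0])
    return [max(x[i] + w[i][j] for i in range(n)) for j in range(m)]

def selected_filters(x, w):
    """Index of the input (filter) attaining the max for each output unit.

    Inputs that are never selected across the dataset contribute nothing
    to the Max-plus outputs, which is the basis of the pruning strategy.
    """
    n, m = len(x), len(w[0])
    return [max(range(n), key=lambda i: x[i] + w[i][j]) for j in range(m)]

# Toy example: three input filters, two Max-plus units.
x = [1.0, 3.0, 0.0]
w = [[0.0, 2.0],
     [1.0, -1.0],
     [5.0, 0.0]]
print(maxplus_unit(x, w))      # -> [5.0, 3.0]
print(selected_filters(x, w))  # -> [2, 0]  (filter 1 is unused here)
```

Here only filters 2 and 0 ever reach the maximum, so for this input filter 1 could be pruned without changing the layer's output.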

Citation (APA)

Zhang, Y., Blusseau, S., Velasco-Forero, S., Bloch, I., & Angulo, J. (2019). Max-plus operators applied to filter selection and model pruning in neural networks. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11564 LNCS, pp. 310–322). Springer Verlag. https://doi.org/10.1007/978-3-030-20867-7_24
