MorphoActivation: Generalizing ReLU Activation Function by Mathematical Morphology

Abstract

This paper analyses both nonlinear activation functions and spatial max-pooling in Deep Convolutional Neural Networks (DCNNs) through the algebraic basis of mathematical morphology. In addition, a general family of activation functions is proposed by treating max-pooling and the nonlinear operators jointly in the context of morphological representations. The experimental section validates the approach on classical supervised-learning benchmarks for DCNNs.
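The abstract does not spell out the operator family, but the morphological reading of ReLU and max-pooling that it builds on can be sketched concretely. In the sketch below, the names relu, maxout_activation, and max_pool_1d are hypothetical, and the max-of-affine (max-plus) parametrization is an illustrative assumption rather than the paper's exact MorphoActivation operator:

import numpy as np

def relu(x):
    # ReLU read morphologically: the maximum (a flat dilation)
    # of the two affine maps x -> x and x -> 0.
    return np.maximum(x, 0.0)

def maxout_activation(x, slopes, biases):
    # Hypothetical max-of-affine generalization of ReLU: a max-plus
    # (tropical) polynomial max_i(a_i * x + b_i). With slopes (1, 0)
    # and biases (0, 0) it reduces exactly to ReLU. The paper's exact
    # parametrization may differ; this is only an illustrative sketch.
    slopes = np.asarray(slopes)[:, None]
    biases = np.asarray(biases)[:, None]
    return np.max(slopes * x[None, :] + biases, axis=0)

def max_pool_1d(x, size=2):
    # Spatial max-pooling: a flat morphological dilation followed by
    # downsampling, the spatial counterpart of the pointwise max above.
    n = len(x) - len(x) % size
    return x[:n].reshape(-1, size).max(axis=1)

if __name__ == "__main__":
    x = np.linspace(-2.0, 2.0, 5)                         # [-2, -1, 0, 1, 2]
    print(relu(x))                                        # [0, 0, 0, 1, 2]
    print(maxout_activation(x, [1.0, 0.0], [0.0, 0.0]))   # identical to ReLU
    print(maxout_activation(x, [1.0, 0.2], [0.0, 0.5]))   # a second affine piece bends the response
    print(max_pool_1d(x, size=2))                         # [-1, 1]

Seen this way, both the pointwise nonlinearity and the spatial pooling are instances of the same max-plus algebra, which is what lets the paper treat them in a single family of operators.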

Cite

APA

Velasco-Forero, S., & Angulo, J. (2022). MorphoActivation: Generalizing ReLU Activation Function by Mathematical Morphology. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 13493 LNCS, pp. 449–461). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-19897-7_35
