Orthogonalizing Weights in Capsule Network Architecture


Abstract

Scalar neural network algorithms are limited in their ability to understand scale, rotational, or affine transformations within images, and resort to average- or max-pooling techniques that yield only translational invariance. In an attempt to overcome these limitations, Hinton et al. introduced vectorized capsule network frameworks that support equivariance while capturing spatial relationships between data points, thus enhancing the predictive capabilities of networks. Moreover, experiments with activation functions, hyperparameters, and optimizers have demonstrated faster convergence, and orthogonalizing the weights within capsule layers further enhances performance by reducing the associated average error rates.
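The abstract does not specify which orthogonalization scheme the authors use. As an illustration only, a common way to orthogonalize a weight matrix is to project it onto the nearest orthogonal matrix via its singular value decomposition; a minimal NumPy sketch (all names here are hypothetical, not from the paper):

```python
import numpy as np

def orthogonalize(W):
    """Return the closest orthogonal matrix to W in Frobenius norm,
    obtained by replacing W's singular values with ones."""
    U, _, Vt = np.linalg.svd(W, full_matrices=False)
    return U @ Vt

# Example: orthogonalize a randomly initialized capsule weight matrix.
rng = np.random.default_rng(0)
W = rng.normal(size=(8, 8))
W_orth = orthogonalize(W)

# The result has orthonormal columns: W_orth.T @ W_orth is the identity.
assert np.allclose(W_orth.T @ W_orth, np.eye(8), atol=1e-8)
```

Applied periodically during training or as a regularization target, such a projection keeps transformation matrices well-conditioned, which is one plausible mechanism for the reported reduction in error rates.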

Citation

Kundu, S., & Gagana, B. (2020). Orthogonalizing Weights in Capsule Network Architecture. In Lecture Notes in Networks and Systems (Vol. 93, pp. 77–84). Springer. https://doi.org/10.1007/978-981-15-0630-7_8
