Scalar neural network algorithms are limited in their ability to understand scale, rotation, or affine transformations within images, and instead resort to average- or max-pooling techniques, which yield only translational invariance. To overcome these limitations, Hinton et al. introduced vectorized capsule network frameworks that support equivariance while capturing spatial relationships between data points, thus enhancing the predictive capabilities of networks. Further, experiments with activation functions, hyperparameters, and optimizers have demonstrated faster convergence, and orthogonalizing the weights within capsule layers enhances performance by reducing the associated average error rates.
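As a minimal illustration of the weight-orthogonalization idea, the sketch below projects a weight matrix onto the nearest matrix with orthonormal columns using an SVD-based (orthogonal Procrustes) projection. This is one standard way to orthogonalize weights; it is an assumption for illustration and may differ from the exact procedure used in the paper.

```python
import numpy as np

def orthogonalize(weights: np.ndarray) -> np.ndarray:
    """Project `weights` onto the nearest matrix with orthonormal
    columns (the orthogonal Procrustes solution via SVD)."""
    u, _, vt = np.linalg.svd(weights, full_matrices=False)
    return u @ vt

# Example: orthogonalize a random 8x4 capsule-layer weight matrix.
rng = np.random.default_rng(0)
w = rng.normal(size=(8, 4))
w_orth = orthogonalize(w)

# Columns are now orthonormal: W^T W = I
assert np.allclose(w_orth.T @ w_orth, np.eye(4))
```

Applying such a projection periodically during training keeps the capsule transformation matrices well-conditioned, which is the kind of regularization effect the abstract attributes to orthogonalized weights.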
Kundu, S., & Gagana, B. (2020). Orthogonalizing Weights in Capsule Network Architecture. In Lecture Notes in Networks and Systems (Vol. 93, pp. 77–84). Springer. https://doi.org/10.1007/978-981-15-0630-7_8