Shift, Rotation and Scale Invariant Signatures for Two-Dimensional Contours, in a Neural Network Architecture

  • Squire, D. McG.
  • Caelli, T. M.

Abstract

A technique for obtaining shift, rotation and scale invariant signatures for two-dimensional contours is proposed and demonstrated. An "invariance factor" is calculated at each point by comparing the orientation of the tangent vector with the vector fields corresponding to the generators of the Lie transformation groups for shift, rotation and scaling. The statistics of these invariance factors over the contour are then used to produce an "invariance signature". This operation is implemented in a Model-Based Neural Network (MBNN), in which the architecture and weights are parameterised by the constraints of the problem domain. After construction and training, the system is structurally identical to a traditional neural network: a collection of layers of nodes with weighted connections. The design and modelling process can thus be thought of as "compiling" an invariant classifier into a neural network. We contend that these invariance signatures, whilst not unique, are sufficient to characterise contours for many pattern recognition tasks.

Citation (APA)

Squire, D. McG., & Caelli, T. M. (1997). Shift, Rotation and Scale Invariant Signatures for Two-Dimensional Contours, in a Neural Network Architecture (pp. 344–348). https://doi.org/10.1007/978-1-4615-6099-9_60
