Neural networks and other sophisticated machine learning algorithms frequently miss simple solutions that can be discovered by more constrained learning methods. The transition from a single neuron solving linearly separable problems, to a multithreshold neuron solving k-separable problems, to neurons implementing prototypes solving q-separable problems, is investigated. Using the Learning Vector Quantization (LVQ) approach, this transition is presented as going from two prototypes defining a single hyperplane, to many co-linear prototypes defining parallel hyperplanes, to unconstrained prototypes defining a Voronoi tessellation. For most datasets, relaxing the co-linearity condition improves accuracy at the cost of increased model complexity, but for data with an inherent logical structure, LVQ algorithms with constraints significantly outperform the original LVQ and many other algorithms. © 2009 Springer Berlin Heidelberg.
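The abstract gives no pseudocode, so the following is only a minimal illustrative sketch of the idea it describes: a standard LVQ1 update, with the co-linearity constraint interpreted here as projecting all prototypes onto their leading principal direction after each epoch, so that nearest-prototype boundaries become parallel hyperplanes. All function and parameter names (lvq1_constrained, n_protos_per_class, colinear) are hypothetical, not from the paper.

```python
import numpy as np

def lvq1_constrained(X, y, n_protos_per_class=2, lr=0.05, epochs=30,
                     colinear=True, rng=None):
    """Sketch of LVQ1 with an optional co-linearity constraint: prototypes
    are kept on a single line, so the decision boundaries between them are
    parallel hyperplanes (the constrained, k-separable-like regime)."""
    rng = np.random.default_rng(rng)
    classes = np.unique(y)
    # Initialize prototypes from random training points of each class.
    protos, labels = [], []
    for c in classes:
        idx = rng.choice(np.flatnonzero(y == c), n_protos_per_class,
                         replace=False)
        protos.append(X[idx])
        labels += [c] * n_protos_per_class
    P = np.vstack(protos).astype(float)
    L = np.array(labels)

    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            # Find the nearest prototype to the current sample.
            j = np.argmin(((P - X[i]) ** 2).sum(axis=1))
            # LVQ1 rule: attract if labels match, repel otherwise.
            sign = 1.0 if L[j] == y[i] else -1.0
            P[j] += sign * lr * (X[i] - P[j])
        if colinear:
            # Co-linearity constraint (illustrative assumption): project
            # prototypes onto their first principal direction so they stay
            # on one line, yielding parallel hyperplane boundaries.
            mean = P.mean(axis=0)
            _, _, Vt = np.linalg.svd(P - mean, full_matrices=False)
            d = Vt[0]
            P = mean + np.outer((P - mean) @ d, d)
    return P, L
```

With colinear=True, the boundaries between consecutive prototypes on the shared line are parallel hyperplanes, matching the constrained regime described above; with colinear=False, the unconstrained prototypes induce a general Voronoi tessellation, as in ordinary LVQ.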
Grochowski, M., & Duch, W. (2009). Constrained learning vector quantization or relaxed k-separability. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5768 LNCS, pp. 151–160). https://doi.org/10.1007/978-3-642-04274-4_16