A novel ensemble of scale-invariant feature maps

Abstract

A novel method for improving the training of topology-preserving algorithms such as the Scale Invariant Feature Map (SIM) and the Maximum Likelihood Hebbian Learning Scale Invariant Map (MAX-SIM) is presented and analyzed in this study. The method, called Weighted Voting Superposition (WeVoS), yields two new versions of these models, the WeVoS-SIM and the WeVoS-MAX-SIM. It is based on training an ensemble of networks and combining them into a single map that retains the best features of each network in the ensemble. To accomplish this combination, a weighted voting process takes place between the corresponding units of the maps in the ensemble in order to determine the characteristics of the units of the resulting map. For comparison purposes, the new models are evaluated against the original SIM and MAX-SIM on an artificial data set, and three quality measures are applied to each model in order to give a complete picture of their capabilities. The results confirm that the WeVoS-based models presented in this study can outperform the classic models in terms of the organization of the presented information.
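
The abstract only outlines the fusion step, so the sketch below shows one plausible way a weighted voting superposition over the units of an ensemble of maps could be implemented. It is not the authors' published algorithm: the function name `wevos_fuse`, the per-unit quality score, and the choice of a quality-weighted average of unit weight vectors are assumptions made purely for illustration.

```python
import numpy as np

def wevos_fuse(ensemble_maps, unit_quality):
    """Fuse an ensemble of topology-preserving maps into a single map
    by weighted voting over corresponding units (illustrative sketch).

    ensemble_maps : array of shape (n_maps, n_units, n_features)
        Weight vectors of every unit in each trained map; all maps are
        assumed to share the same grid topology.
    unit_quality : array of shape (n_maps, n_units)
        A hypothetical per-unit quality score for each map (for example,
        the fraction of samples a unit wins, or the inverse of its
        quantization error). Higher means more trustworthy.
    """
    ensemble_maps = np.asarray(ensemble_maps, dtype=float)
    unit_quality = np.asarray(unit_quality, dtype=float)

    # Normalize the scores so that, at every unit position, the votes
    # of the ensemble members sum to one.
    votes = unit_quality / unit_quality.sum(axis=0, keepdims=True)

    # Each unit of the fused map is the quality-weighted average of the
    # corresponding units across the ensemble.
    fused = np.einsum('mu,muf->uf', votes, ensemble_maps)
    return fused
```

As a usage example, fusing five hypothetical maps of 100 units each, trained on 3-dimensional data:

```python
rng = np.random.default_rng(0)
maps = rng.normal(size=(5, 100, 3))            # 5 maps, 100 units, 3-D weight vectors
quality = rng.uniform(0.1, 1.0, size=(5, 100)) # per-unit quality scores
fused_map = wevos_fuse(maps, quality)          # shape (100, 3): a single fused map
```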

Citation (APA)

Baruque, B., & Corchado, E. (2009). A novel ensemble of scale-invariant feature maps. Advances in Intelligent and Soft Computing, 57, 265–273. https://doi.org/10.1007/978-3-540-93905-4_32
