Parallel perceptrons, activation margins and imbalanced training set pruning

Abstract

A natural way to deal with training samples in imbalanced class problems is to prune them, removing both redundant patterns (those easy to classify and probably over-represented) and label-noisy patterns (those that belong to one class but are labelled as members of another). This allows classifier construction to focus on borderline patterns, which are likely to be the most informative ones. To define these subsets appropriately, in this work we use as base classifiers the so-called parallel perceptrons, a novel approach to committee machine training that, among other things, makes it possible to naturally define margins for hidden unit activations. We use these margins to characterize the above pattern types and to iteratively select subsamples of an initial training set, enhancing classification accuracy and allowing for balanced classifier performance even when class sizes are greatly different. © Springer-Verlag Berlin Heidelberg 2005.
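The pruning scheme the abstract describes can be made concrete with a short sketch. The following NumPy illustration assumes a trained parallel perceptron whose hidden-unit weight vectors are the rows of W, training patterns X, labels y in {-1, +1}, and a hypothetical margin threshold gamma; the sample-level margin used here (label-scaled mean activation) and the threshold rule are illustrative assumptions, not necessarily the paper's exact definitions.

import numpy as np

def prune_training_set(W, X, y, gamma=0.5):
    # Per-unit activations w_i . x for every pattern; their distance from
    # the 0 threshold gives each hidden perceptron's activation margin.
    # (The committee output of a parallel perceptron is the sign of the
    # summed hidden-unit signs, i.e. a majority vote.)
    activations = X @ W.T                  # shape: (n_samples, n_units)

    # Assumed sample-level margin: label-scaled mean activation, positive
    # when the committee is confidently right, negative when confidently
    # wrong.
    margins = y * activations.mean(axis=1)

    redundant = margins > gamma            # easy, likely over-represented
    noisy = margins < -gamma               # labelled as the wrong class
    borderline = ~(redundant | noisy)      # likely the most informative

    # Keep only the borderline patterns for the next training round.
    return X[borderline], y[borderline]

In the setting the abstract describes, this selection would be applied iteratively: retrain the parallel perceptron on the pruned set, recompute the margins, and prune again, so that the classifier progressively concentrates on borderline patterns and keeps its performance balanced across unequal class sizes.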

Cite

APA

Cantador, I., & Dorronsoro, J. R. (2005). Parallel perceptrons, activation margins and imbalanced training set pruning. In Lecture Notes in Computer Science (Vol. 3523, pp. 43–50). Springer-Verlag. https://doi.org/10.1007/11492542_6
