Maximum contrast classifiers

Abstract

Within the Bayesian setting of classification we present a method for classifier design based on constrained density modelling. The approach leads to maximization of a contrast function, which measures the discriminative power of the class-conditional densities used for classification. By imposing an upper bound on the density contrast, the sensitivity of the classifiers can be increased in regions with low density differences, which are usually most important for discrimination. We introduce a parametrization of the contrast in terms of modified kernel density estimators with variable mixing weights. In practice the approach shows some favourable properties. First, for fixed hyperparameters, training of the resulting Maximum Contrast Classifier (MCC) reduces to a linear program for optimization of the mixing weights. Second, for a certain choice of the density contrast bound and the kernel bandwidth, the maximum contrast solutions lead to sparse representations of the classifiers with good generalization performance, similar to the maximum margin solutions of support vector machines. Third, the method extends readily to the general multi-class problem, since training proceeds in the same way as in the binary case. © Springer-Verlag Berlin Heidelberg 2002.
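The abstract's key computational claim is that MCC training is a linear program over kernel mixing weights under a contrast bound. The paper's exact contrast function is not given here, so the following is only a hedged sketch under one plausible reading: model the signed contrast as r(x) = Σᵢ yᵢ aᵢ K(x, xᵢ) with nonnegative mixing weights summing to one, and maximize Σₙ min(yₙ r(xₙ), ε), where ε is the (assumed) upper bound on the per-point contrast. The kernel bandwidth `h`, bound `eps`, and helper names are illustrative, not from the paper.

```python
import numpy as np
from scipy.optimize import linprog

def rbf_kernel(X, Z, h):
    """Gaussian kernel matrix K[n, i] = exp(-||x_n - z_i||^2 / (2 h^2))."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * h ** 2))

def train_mcc(X, y, h=1.0, eps=0.05):
    """Sketch of MCC training as a linear program (assumed formulation).

    Variables: mixing weights a (N) and slacks t (N).
    Maximize sum_n t_n  s.t.  t_n <= y_n * r(x_n),  t_n <= eps,
                              a >= 0,  sum(a) = 1,
    where r(x_n) = sum_i y_i a_i K(x_n, x_i). The eps cap plays the role
    of the density contrast bound described in the abstract.
    """
    N = len(X)
    K = rbf_kernel(X, X, h)
    # Decision vector z = [a_1..a_N, t_1..t_N]; minimize -sum(t).
    c = np.concatenate([np.zeros(N), -np.ones(N)])
    # t_n - y_n * sum_i y_i K[n, i] a_i <= 0
    A1 = np.hstack([-(y[:, None] * K * y[None, :]), np.eye(N)])
    # t_n <= eps  (contrast bound)
    A2 = np.hstack([np.zeros((N, N)), np.eye(N)])
    A_ub = np.vstack([A1, A2])
    b_ub = np.concatenate([np.zeros(N), eps * np.ones(N)])
    # sum(a) = 1
    A_eq = np.hstack([np.ones((1, N)), np.zeros((1, N))])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * N + [(None, None)] * N)
    return res.x[:N]  # mixing weights; many end up at zero (sparsity)

def predict(X_test, X, y, a, h=1.0):
    """Classify by the sign of the weighted density contrast."""
    return np.sign(rbf_kernel(X_test, X, h) @ (y * a))
```

On two well-separated toy clusters, the LP spreads weight across both classes (concentrating the total reward at the ε cap on every point) and typically leaves many weights exactly zero, consistent with the sparsity the abstract attributes to the contrast bound.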

Citation (APA)

Meinicke, P., Twellmann, T., & Ritter, H. (2002). Maximum contrast classifiers. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 2415 LNCS, pp. 745–750). Springer Verlag. https://doi.org/10.1007/3-540-46084-5_121
