Kernel maximum a posteriori classification with error bound analysis

Abstract

Kernel methods have been widely used in data classification. Many kernel-based classifiers, such as Kernel Support Vector Machines (KSVM), assume that data can be separated by a hyperplane in the feature space. These methods do not consider the data distribution. This paper proposes a novel Kernel Maximum A Posteriori (KMAP) classification method, which imposes a Gaussian density assumption on the data in the feature space and can be regarded as a more general classification method than other kernel-based classifiers such as Kernel Fisher Discriminant Analysis (KFDA). We also adopt robust methods for parameter estimation. In addition, the error bound analysis for KMAP indicates the effectiveness of the Gaussian density assumption in the feature space. Furthermore, KMAP achieves very promising results on eight UCI benchmark data sets against competitive methods. © 2008 Springer-Verlag Berlin Heidelberg.
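The abstract does not spell out the KMAP algorithm itself, but the core idea — estimate per-class Gaussian densities and assign each point to the class with the maximum posterior — can be illustrated in the input space. The sketch below is a hypothetical, input-space analogue (essentially quadratic discriminant analysis), not the paper's kernelized method; the covariance regularization term is an assumption in the spirit of the paper's "robust parameter estimation".

```python
import numpy as np

def fit_gaussian_map(X, y, reg=1e-6):
    """Estimate per-class priors, means, and covariances.

    Input-space sketch only; KMAP applies the same Gaussian density
    assumption in a kernel-induced feature space.
    """
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        prior = len(Xc) / len(X)
        mean = Xc.mean(axis=0)
        # Regularize so the covariance stays invertible (assumed device,
        # analogous to robust parameter estimation).
        cov = np.cov(Xc, rowvar=False) + reg * np.eye(X.shape[1])
        params[c] = (prior, mean, cov)
    return params

def predict_map(params, X):
    """Assign each point to the class maximizing the log-posterior."""
    classes = sorted(params)
    scores = []
    for c in classes:
        prior, mean, cov = params[c]
        diff = X - mean
        inv = np.linalg.inv(cov)
        _, logdet = np.linalg.slogdet(cov)
        # log p(c) + log N(x | mean, cov), dropping constants shared by all classes
        log_post = (np.log(prior) - 0.5 * logdet
                    - 0.5 * np.einsum('ij,jk,ik->i', diff, inv, diff))
        scores.append(log_post)
    return np.array(classes)[np.argmax(np.stack(scores), axis=0)]
```

A kernelized version would replace the explicit means and covariances with their feature-space counterparts computed through the kernel matrix, which is where the paper's error bound analysis applies.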

CITATION STYLE

APA

Xu, Z., Huang, K., Zhu, J., King, I., & Lyu, M. R. (2008). Kernel maximum a posteriori classification with error bound analysis. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4984 LNCS, pp. 841–850). https://doi.org/10.1007/978-3-540-69158-7_87
