Kernel difference-weighted k-nearest neighbors classification

Abstract

The Nearest Neighbor (NN) rule is one of the simplest and most important methods in pattern recognition. In this paper, we propose a kernel difference-weighted k-nearest neighbor method (KDF-WKNN) for pattern classification. The proposed method formulates the weighted KNN rule as a constrained optimization problem and derives an efficient solution for computing the weights of the different nearest neighbors. Unlike distance-weighted KNN, which assigns weights to the nearest neighbors according to their distances from the unclassified sample, KDF-WKNN weights the nearest neighbors using both the norm and the correlation of the differences between the unclassified sample and its nearest neighbors. Our experimental results indicate that KDF-WKNN outperforms the original KNN and distance-weighted KNN, and is comparable to some state-of-the-art methods in terms of classification accuracy. © Springer-Verlag Berlin Heidelberg 2007.
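The abstract does not spell out the constrained optimization, but one plausible reading, consistent with the description above, is a quadratic program over the Gram matrix of neighbor difference vectors with a sum-to-one constraint on the weights. The sketch below follows that reading only; the function name kdf_wknn_predict, the ridge term reg, and the use of a plain linear inner product in place of a kernel are all assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def kdf_wknn_predict(X_train, y_train, x, k=5, reg=1e-3):
    """Hedged sketch of difference-weighted KNN in the spirit of KDF-WKNN.

    Weights the k nearest neighbors of x by solving a small quadratic
    program over the Gram matrix of difference vectors (x_i - x).
    `reg` is an assumed ridge term added purely for numerical stability.
    """
    # Find the k nearest neighbors of x (Euclidean distance).
    dists = np.linalg.norm(X_train - x, axis=1)
    idx = np.argsort(dists)[:k]
    D = X_train[idx] - x                      # difference vectors, shape (k, d)

    # Gram matrix of the differences; its diagonal holds the squared norms
    # and its off-diagonal entries the correlations of the differences.
    # A kernel k(., .) could replace this linear inner product.
    G = D @ D.T + reg * np.eye(k)

    # Minimize (1/2) w^T G w  subject to  sum(w) = 1.
    # The Lagrangian solution is w proportional to G^{-1} 1.
    w = np.linalg.solve(G, np.ones(k))
    w /= w.sum()

    # The class with the largest summed neighbor weight wins.
    labels = y_train[idx]
    classes = np.unique(labels)
    scores = [w[labels == c].sum() for c in classes]
    return classes[np.argmax(scores)]

# Example usage (toy data):
X = np.array([[0.0, 0.0], [0.1, 0.0], [1.0, 1.0], [1.1, 0.9]])
y = np.array([0, 0, 1, 1])
print(kdf_wknn_predict(X, y, np.array([0.9, 1.0]), k=3))  # -> 1
```

Because G encodes inner products between the difference vectors, this weighting depends on both the norms and the mutual correlations of the differences, which matches the behavior the abstract describes; swapping the linear Gram matrix for a kernel Gram matrix would give a kernelized variant.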

Citation (APA)

Zuo, W., Wang, K., Zhang, H., & Zhang, D. (2007). Kernel difference-weighted k-nearest neighbors classification. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4682 LNAI, pp. 861–870). Springer Verlag. https://doi.org/10.1007/978-3-540-74205-0_89
