An adaptive hybrid and cluster-based model for speeding up the k-NN classifier


Abstract

A well-known classification method is the k-Nearest Neighbors (k-NN) classifier. However, sequentially searching for the nearest neighbors in large datasets degrades its performance because of the high computational cost involved. This paper proposes a cluster-based classification model for speeding up the k-NN classifier. The model aims to reduce the cost as much as possible while maintaining a high level of classification accuracy. It consists of a simple data structure and a hybrid, adaptive algorithm that accesses this structure. Initially, a preprocessing clustering procedure builds the data structure. Then, the proposed algorithm, based on user-defined acceptance criteria, attempts to classify an incoming item using only the nearest cluster centroids. Upon failure, the incoming item is classified by searching for the k nearest neighbors within specific clusters. The proposed approach was tested on five real-life datasets. The results show that it can be used either to achieve high accuracy with gains in cost or to reduce the cost to a minimum with slightly lower accuracy. © 2012 Springer-Verlag.
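To make the two-stage idea concrete, below is a minimal Python sketch of a cluster-based hybrid k-NN classifier in the spirit of the abstract. The abstract does not specify the acceptance criteria or how the fallback clusters are chosen, so the criterion used here (a cluster-purity threshold) and the restriction of the fallback search to the single nearest cluster are illustrative assumptions, not the authors' actual method.

```python
# Hypothetical sketch of a cluster-based hybrid k-NN classifier.
# Assumptions (not from the paper): the acceptance criterion is a
# purity threshold on the nearest cluster, and the fallback k-NN
# search is restricted to that single cluster.
import numpy as np
from collections import Counter
from sklearn.cluster import KMeans


class ClusterKNN:
    def __init__(self, n_clusters=50, k=5, purity_threshold=0.9):
        self.n_clusters = n_clusters              # clusters built in preprocessing
        self.k = k                                # neighbors used in the fallback k-NN
        self.purity_threshold = purity_threshold  # assumed acceptance criterion

    def fit(self, X, y):
        """Preprocessing: cluster the training set and store, per cluster,
        its centroid, its member items, its majority class, and its purity."""
        self.km_ = KMeans(n_clusters=self.n_clusters, n_init=10).fit(X)
        labels = self.km_.labels_
        self.centroids_ = self.km_.cluster_centers_
        self.members_X_, self.members_y_ = [], []
        self.majority_, self.purity_ = [], []
        for c in range(self.n_clusters):
            idx = np.where(labels == c)[0]
            self.members_X_.append(X[idx])
            self.members_y_.append(y[idx])
            counts = Counter(y[idx])
            cls, n = counts.most_common(1)[0] if counts else (None, 0)
            self.majority_.append(cls)
            self.purity_.append(n / max(len(idx), 1))
        return self

    def predict_one(self, x):
        # Stage 1: try to classify cheaply using only the nearest centroid.
        d = np.linalg.norm(self.centroids_ - x, axis=1)
        nearest = int(np.argmin(d))
        if self.purity_[nearest] >= self.purity_threshold:
            return self.majority_[nearest]        # accepted: centroid's majority label
        # Stage 2 (fallback): k-NN search restricted to the nearest cluster.
        Xc, yc = self.members_X_[nearest], self.members_y_[nearest]
        dists = np.linalg.norm(Xc - x, axis=1)
        knn_idx = np.argsort(dists)[: self.k]
        return Counter(yc[knn_idx]).most_common(1)[0][0]

    def predict(self, X):
        return np.array([self.predict_one(x) for x in X])
```

The cost saving comes from Stage 1: items that satisfy the acceptance criterion are labeled after comparing against only the cluster centroids, and the full distance computations of k-NN are paid only for the items that fall through to Stage 2, and even then only against the members of selected clusters rather than the whole training set.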

Citation (APA)

Ougiaroglou, S., Evangelidis, G., & Dervos, D. A. (2012). An adaptive hybrid and cluster-based model for speeding up the k-NN classifier. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 7209 LNAI, pp. 163–175). https://doi.org/10.1007/978-3-642-28931-6_16
