The primary drawbacks of nonparametric algorithms such as k-nearest neighbor (k-NN) classification are their high computational and storage requirements. Moreover, classification effectiveness degrades when the training data are unevenly distributed. In this paper, we present three approaches to reduce computation time and storage requirements: fast k-NN, training set reduction techniques, and a hybrid of the two. We compare the three approaches against existing methods, and the results show that all three are significantly more effective (in terms of execution time and storage requirement) than existing algorithms.
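To make the setting concrete, the following is a minimal sketch of the two ingredients the abstract names, not the authors' specific fast k-NN or reduction method: a brute-force k-NN classifier, and Hart's condensed nearest neighbor (CNN) rule as one classic example of a training set reduction technique. All function names here are illustrative.

```python
# Sketch only: baseline brute-force k-NN plus Hart's CNN reduction rule.
# This is NOT the paper's fast k-NN or its hybrid method.
from collections import Counter
import math


def euclidean(a, b):
    """Euclidean distance between two equal-length numeric tuples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


def knn_predict(train, labels, query, k=3):
    """Classify `query` by majority vote among its k nearest training points."""
    ranked = sorted(range(len(train)), key=lambda i: euclidean(train[i], query))
    votes = Counter(labels[i] for i in ranked[:k])
    return votes.most_common(1)[0][0]


def condense(train, labels):
    """Hart's CNN rule: grow a subset by adding any point the current
    subset misclassifies with 1-NN; return indices of the kept points."""
    keep = [0]  # seed the subset with the first training point
    changed = True
    while changed:
        changed = False
        for i in range(len(train)):
            if i in keep:
                continue
            pred = knn_predict([train[j] for j in keep],
                               [labels[j] for j in keep], train[i], k=1)
            if pred != labels[i]:
                keep.append(i)
                changed = True
    return keep
```

For two well-separated clusters, the condensed subset typically retains only a few points per class, which illustrates how reduction cuts both storage and per-query distance computations while preserving the decision boundary.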
CITATION STYLE
Prajapati, B. P., & Kathiriya, D. R. (2019). A hybrid machine learning technique for fusing fast k-NN and training set reduction: combining both improves the effectiveness of classification. In Advances in Intelligent Systems and Computing (Vol. 714, pp. 229–240). Springer Verlag. https://doi.org/10.1007/978-981-13-0224-4_21