Neighborhood random classification


Abstract

Ensemble methods (EMs) have become increasingly popular in data mining because of their efficiency. These methods generate a set of classifiers using one or several machine learning algorithms (MLAs) and aggregate them into a single classifier (meta-classifier, MC). Decision trees (DTs), SVMs and k-nearest neighbors (kNN) are among the most widely used in the context of EMs. Here, we propose an approach based on neighborhood graphs as an alternative. Through related graphs such as relative neighborhood graphs (RNGs), Gabriel graphs (GGs) and minimum spanning trees (MSTs), we provide a generalization of the kNN approach with fewer arbitrary parameters, such as the value of k. Neighborhood graphs have never before been introduced into EM approaches. The results of our algorithm, Neighborhood Random Classification, are very promising, matching those of the best EM approaches such as Random Forest or those based on SVMs. In this preliminary experimental work, we present the methodological approach and extensive comparative results. We also report results on the influence of the neighborhood structure on classifier efficiency and raise some issues that deserve further study. © 2013 Springer-Verlag.
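To make the idea concrete, here is a minimal sketch (not the authors' implementation) of classification via one of the neighborhood graphs the abstract names, the Gabriel graph: two points are neighbors iff no third point lies inside the closed ball whose diameter is the segment joining them, and a query point is labeled by majority vote among its graph neighbors. The brute-force O(n³) construction and the function names are illustrative assumptions.

```python
from collections import Counter

def gabriel_neighbors(points):
    """Build the Gabriel graph over `points`.
    p and q are neighbors iff for every other point r:
        d(p, r)^2 + d(q, r)^2 >= d(p, q)^2
    (i.e. no r falls inside the ball with diameter pq)."""
    d2 = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    n = len(points)
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            dij = d2(points[i], points[j])
            if all(d2(points[i], r) + d2(points[j], r) >= dij
                   for k, r in enumerate(points) if k not in (i, j)):
                adj[i].add(j)
                adj[j].add(i)
    return adj

def classify(points, labels, query):
    """Label `query` by majority vote among its Gabriel neighbors,
    computed over the training points plus the query itself.
    Note there is no k to tune: the graph decides the neighborhood."""
    graph = gabriel_neighbors(points + [query])
    q = len(points)  # index of the query point in the extended set
    votes = Counter(labels[i] for i in graph[q])
    return votes.most_common(1)[0][0]
```

An ensemble in the spirit of the paper would then aggregate many such classifiers, e.g. each built on a bootstrap sample or a random feature subspace, instead of using a single graph.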

Citation (APA)

Zighed, D. A., Ezzeddine, D., & Rico, F. (2013). Neighborhood random classification. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8085 LNCS, pp. 767–774). https://doi.org/10.1007/978-3-642-40020-9_86
