Naive Bayesian classifier based on neighborhood probability

Abstract

When calculating the class-conditional probability of continuous attributes in the naive Bayesian classifier (NBC), existing methods usually fit the true probability density function with a superposition of many normal probability density functions, so that the class-conditional probability equals the sum of the values of these density functions at the attribute value. In this paper, we propose NPNBC, a naive Bayesian classifier based on neighborhood probability. In NPNBC, when calculating the class-conditional probability for a continuous attribute value of a given unknown example, a small neighborhood is created around that value in every normal probability density function, yielding a neighborhood probability for each component. The sum of these neighborhood probabilities is taken as the class-conditional probability of the continuous attribute value. Our experimental results demonstrate that NPNBC achieves remarkable classification accuracy compared with the normal method and the kernel method. In addition, we investigate the relationship between the classification accuracy of NPNBC and the size of the neighborhood. © 2012 Springer-Verlag Berlin Heidelberg.
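The sketch below illustrates the neighborhood-probability idea described in the abstract: instead of summing the density values of the fitted Gaussian components at the attribute value, it sums the probability mass each component assigns to a small interval around that value (computed as a CDF difference). This is a minimal illustration, not the authors' implementation; the component means, standard deviations, equal weighting, and the neighborhood half-width delta are all assumptions made for the example.

```python
# A minimal sketch of the neighborhood-probability idea from the abstract,
# NOT the authors' reference implementation. Component parameters, equal
# weighting, and the half-width `delta` are illustrative assumptions.
from scipy.stats import norm

def neighborhood_class_conditional(x, means, stds, delta=0.05, weights=None):
    """Approximate P(x | class) for one continuous attribute by summing,
    over the Gaussian components fitted for that class, the probability
    mass each component assigns to the neighborhood [x - delta, x + delta]."""
    if weights is None:
        # Equal weights, as in a kernel-style superposition of components.
        weights = [1.0 / len(means)] * len(means)
    prob = 0.0
    for mu, sigma, w in zip(means, stds, weights):
        # Probability mass of the neighborhood under this component
        # (difference of the normal CDF at the interval endpoints).
        prob += w * (norm.cdf(x + delta, loc=mu, scale=sigma)
                     - norm.cdf(x - delta, loc=mu, scale=sigma))
    return prob

# Example: three hypothetical Gaussian components fitted to one attribute
# for a given class; evaluate the neighborhood probability at x = 1.2.
print(neighborhood_class_conditional(1.2, means=[0.8, 1.5, 2.1], stds=[0.3, 0.4, 0.5]))
```

In a full classifier, this quantity would replace the density-sum estimate of the class-conditional probability for each continuous attribute, with the usual NBC product over attributes and class priors left unchanged.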

Citation (APA)

Liu, J. N. K., He, Y., Wang, X., & Hu, Y. (2012). Naive Bayesian classifier based on neighborhood probability. In Communications in Computer and Information Science (Vol. 299 CCIS, pp. 112–121). https://doi.org/10.1007/978-3-642-31718-7_12
