k-Medoids Clustering Based on Kernel Density Estimation and Jensen-Shannon Divergence

Abstract

Several conventional clustering methods consider the squared L2-norm, which is calculated from object coordinates. To extract meaningful clusters from a set of massive objects, the dissimilarity should be calculated not only from object coordinates but also from other features such as the distribution of objects. In this paper, JS-divergence based k-medoids (JSKMdd) is proposed as a novel method for clustering network data. In the proposed method, a dissimilarity based on both object coordinates and object distributions is considered. The effectiveness of the proposed method is verified through numerical experiments on artificial datasets that consist of non-linear clusters. The influence of the parameter in the proposed method is also described.
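
The abstract only outlines the idea, so the following is a minimal illustrative sketch rather than the authors' exact formulation: estimate a local density for each object with kernel density estimation, measure the Jensen-Shannon divergence between those densities, mix it with the squared L2 distance, and run k-medoids on the resulting dissimilarity matrix. The mixing weight lam, the k-nearest-neighbour local window, the Gaussian bandwidth, and the evaluation grid are all assumptions introduced here for illustration.

# A minimal sketch of the idea described in the abstract, NOT the authors'
# exact JSKMdd formulation: each object gets a local KDE, pairwise
# dissimilarity mixes squared Euclidean distance with the Jensen-Shannon
# divergence between those densities, and plain k-medoids is run on the
# resulting dissimilarity matrix. Parameter names and defaults are assumptions.
import numpy as np

def gaussian_kde_on_grid(points, grid, bandwidth=0.5):
    """Evaluate a Gaussian KDE built from `points` at the grid locations."""
    diffs = grid[:, None, :] - points[None, :, :]            # (G, n, d)
    sq = np.sum(diffs ** 2, axis=-1)                          # (G, n)
    dens = np.exp(-sq / (2.0 * bandwidth ** 2)).mean(axis=1)
    return dens / (dens.sum() + 1e-12)                        # normalize to a pmf on the grid

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence between two discrete distributions."""
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log((a + eps) / (b + eps)))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def dissimilarity_matrix(X, k_nn=10, lam=0.5, bandwidth=0.5, grid_size=20):
    """Combine squared L2 distance with JS divergence of local KDEs (assumed weighting)."""
    n, d = X.shape
    # evaluation grid over the bounding box of the data (assumption)
    axes = [np.linspace(X[:, j].min(), X[:, j].max(), grid_size) for j in range(d)]
    grid = np.stack(np.meshgrid(*axes), axis=-1).reshape(-1, d)

    # local distribution of each object: KDE over its k nearest neighbours
    sq_l2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    dens = [gaussian_kde_on_grid(X[np.argsort(sq_l2[i])[:k_nn]], grid, bandwidth)
            for i in range(n)]

    D = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            D[i, j] = D[j, i] = (1.0 - lam) * sq_l2[i, j] + lam * js_divergence(dens[i], dens[j])
    return D

def k_medoids(D, k, n_iter=100, seed=0):
    """Plain alternating k-medoids on a precomputed dissimilarity matrix."""
    rng = np.random.default_rng(seed)
    n = D.shape[0]
    medoids = rng.choice(n, size=k, replace=False)
    for _ in range(n_iter):
        labels = np.argmin(D[:, medoids], axis=1)
        new_medoids = medoids.copy()
        for c in range(k):
            members = np.where(labels == c)[0]
            if members.size:
                new_medoids[c] = members[np.argmin(D[np.ix_(members, members)].sum(axis=1))]
        if np.array_equal(new_medoids, medoids):
            break
        medoids = new_medoids
    return labels, medoids

if __name__ == "__main__":
    # toy data with two non-linear (ring-like) clusters
    rng = np.random.default_rng(1)
    t = rng.uniform(0, 2 * np.pi, 100)
    inner = np.c_[np.cos(t), np.sin(t)] + 0.05 * rng.standard_normal((100, 2))
    outer = 3.0 * np.c_[np.cos(t), np.sin(t)] + 0.05 * rng.standard_normal((100, 2))
    X = np.vstack([inner, outer])
    D = dissimilarity_matrix(X, k_nn=15, lam=0.7, bandwidth=0.4)
    labels, medoids = k_medoids(D, k=2)
    print("medoid indices:", medoids)

Varying lam between 0 (purely coordinate-based) and 1 (purely distribution-based) controls the trade-off between the two dissimilarity terms, which is presumably the kind of parameter influence the abstract refers to.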

Citation (APA)

Hamasuna, Y., Kingetsu, Y., & Nakano, S. (2019). k-Medoids Clustering Based on Kernel Density Estimation and Jensen-Shannon Divergence. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11676 LNAI, pp. 272–282). Springer Verlag. https://doi.org/10.1007/978-3-030-26773-5_24
