A comparative analysis of one-class structural risk minimization by support vector machines and nearest neighbor rule

Abstract

One-class classification is an important problem with applications in several areas, such as outlier detection and machine monitoring. In this paper we propose a novel method for one-class classification, referred to as kernel kNNDDSRM. It is a modification of an earlier algorithm, kNNDDSRM, that uses the kernel trick to build more flexible data descriptions. The modification preserves the algorithm's main feature: a significant reduction in the number of stored prototypes compared to NNDD. To assess the method, we carried out experiments on synthetic and real data and compared it with the support vector data description (SVDD) method. The experimental results show that our one-class classification approach outperformed SVDD in terms of the area under the receiver operating characteristic (ROC) curve on six of the eight data sets. The results also show that kernel kNNDDSRM markedly outperformed kNNDDSRM. © 2008 International Federation for Information Processing.
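For readers unfamiliar with kernelized nearest-neighbour data description, the sketch below illustrates the general idea under stated assumptions. It is not the authors' kernel kNNDDSRM (which additionally reduces the stored prototype set via structural risk minimization), but a minimal NNDD-style scorer in which distances are computed in the kernel-induced feature space via the kernel trick, evaluated with ROC AUC against a one-class SVM used as an SVDD-like baseline. The RBF kernel, the gamma value, and the toy data are illustrative assumptions, not values from the paper.

```python
import numpy as np
from sklearn.metrics import roc_auc_score
from sklearn.metrics.pairwise import rbf_kernel

# Illustrative sketch only: a kernelized NNDD-style one-class scorer,
# NOT the authors' kernel kNNDDSRM. Feature-space distances use the
# kernel trick: d^2(x, y) = k(x, x) - 2 k(x, y) + k(y, y).

def kernel_sq_dists(A, B, gamma=0.5):
    K_AB = rbf_kernel(A, B, gamma=gamma)
    K_AA = rbf_kernel(A, A, gamma=gamma).diagonal()[:, None]
    K_BB = rbf_kernel(B, B, gamma=gamma).diagonal()[None, :]
    return K_AA - 2.0 * K_AB + K_BB

def nndd_scores(X_train, X_test, gamma=0.5):
    """NNDD-style score: distance of a test point to its nearest prototype,
    normalised by that prototype's distance to its own nearest neighbour.
    Larger scores indicate more outlier-like points."""
    d_test = kernel_sq_dists(X_test, X_train, gamma)
    nn_idx = d_test.argmin(axis=1)
    d_to_nn = d_test[np.arange(len(X_test)), nn_idx]

    d_train = kernel_sq_dists(X_train, X_train, gamma)
    np.fill_diagonal(d_train, np.inf)              # exclude self-distances
    d_nn_to_its_nn = d_train.min(axis=1)[nn_idx]

    return np.sqrt(d_to_nn) / (np.sqrt(d_nn_to_its_nn) + 1e-12)

# Toy usage: compare ROC AUC with a one-class SVM (SVDD-like baseline).
if __name__ == "__main__":
    from sklearn.svm import OneClassSVM

    rng = np.random.default_rng(0)
    X_target = rng.normal(0.0, 1.0, size=(200, 2))     # target class
    X_outlier = rng.uniform(-6.0, 6.0, size=(100, 2))   # outliers
    X_train = X_target[:150]
    X_test = np.vstack([X_target[150:], X_outlier])
    y_test = np.r_[np.zeros(50), np.ones(100)]          # 1 = outlier

    auc_nndd = roc_auc_score(y_test, nndd_scores(X_train, X_test))

    ocsvm = OneClassSVM(kernel="rbf", gamma=0.5, nu=0.1).fit(X_train)
    auc_svm = roc_auc_score(y_test, -ocsvm.decision_function(X_test))

    print(f"AUC (kernel NNDD sketch): {auc_nndd:.3f}")
    print(f"AUC (one-class SVM):      {auc_svm:.3f}")
```

The negated decision function of the one-class SVM is used so that, as with the NNDD-style score, larger values indicate outliers before computing the ROC AUC.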

Cite

APA

Cabral, G. G., & Oliveira, A. L. I. (2008). A comparative analysis of one-class structural risk minimization by support vector machines and nearest neighbor rule. In IFIP International Federation for Information Processing (Vol. 276, pp. 245–254). https://doi.org/10.1007/978-0-387-09695-7_24
