Optimization approach for feature selection and classification with support vector machine


Abstract

The support vector machine (SVM) is one of the most popular tools for solving classification problems. It builds a classifier by solving an optimization problem that determines which instances of the training data set become support vectors. Feature selection is equally important for choosing an optimal feature set: irrelevant and redundant features degrade data mining performance, so feature selection aims to pick a small number of relevant attributes that yield better classification performance than using all attributes. Its two main goals are improving classification performance and reducing the number of features. Moreover, existing subset selection algorithms treat the task as a single-objective problem. Attribute selection is carried out by combining an attribute evaluator with a search method in the WEKA machine learning tool. In the proposed work, the SVM classification algorithm is applied together with a classifier subset evaluator to automatically classify standard data sets.
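The following is a minimal sketch of the general idea described above, wrapper-style feature subset selection guided by an SVM followed by SVM classification. It uses scikit-learn rather than the WEKA attribute evaluator and search method used in the paper, and the dataset, number of selected features, and SVM parameters are illustrative assumptions, not the authors' experimental setup.

```python
# Illustrative sketch (not the authors' WEKA pipeline): wrapper-style feature
# selection guided by an SVM, followed by SVM classification, in scikit-learn.
# The dataset and all parameter values below are assumptions for demonstration.
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Evaluator: a linear SVM scores each candidate feature subset via cross-validation
# (mirroring the classifier-subset-evaluator idea); greedy forward selection plays
# the role of the search method.
svm = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
selector = SequentialFeatureSelector(svm, n_features_to_select=10,
                                     direction="forward", cv=5)
selector.fit(X_train, y_train)

# Train the final SVM classifier on the selected features only and evaluate it.
svm.fit(selector.transform(X_train), y_train)
accuracy = svm.score(selector.transform(X_test), y_test)
print("Selected feature indices:", selector.get_support().nonzero()[0])
print(f"Test accuracy with selected features: {accuracy:.3f}")
```

In this sketch the same SVM serves as both the subset evaluator and the final classifier, which is the usual wrapper design; the two objectives named in the abstract, better classification performance and fewer features, are reflected in the cross-validated score and the fixed subset size.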

Citation (APA)

Chidambaram, S., & Srinivasagan, K. G. (2016). Optimization approach for feature selection and classification with support vector machine. In Advances in Intelligent Systems and Computing (Vol. 410, pp. 103–111). Springer Verlag. https://doi.org/10.1007/978-81-322-2734-2_11
