A short note about the application of polynomial kernels with fractional degree in support vector learning

Abstract

In the mid-1990s a fundamentally new machine learning approach was developed by V. N. Vapnik: the Support Vector Machine (SVM). This method is very promising and is receiving growing attention in fields where neural networks and decision tree methods are applied. While neural networks may be considered (correctly or not) to be well understood and are in wide use, support vector learning still has some rough edges in its theoretical details, and its inherent numerical tasks prevent it from being easily applied in practice. This paper picks up a new aspect, the use of fractional degrees on polynomial kernels in the SVM, discovered in the course of an implementation of the algorithm. Fractional degrees on polynomial kernels broaden the capabilities of the SVM and offer the possibility to deal with feature spaces of infinite dimension. We introduce a method to simplify the quadratic programming problem at the core of the SVM.
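The excerpt does not include the paper's formal kernel definition or optimization method, so the following is only a minimal illustrative sketch, not the authors' implementation: it plugs a polynomial kernel K(x, y) = (x . y + c)^d with an assumed fractional degree d = 1.5 into a standard SVM as a custom (precomputed-by-callable) kernel; the dataset, scaling, and parameter values are all assumptions for demonstration.

    # Illustrative sketch only: a polynomial kernel with a fractional degree,
    # used with a standard SVM. Degree, offset, and preprocessing are assumed.
    import numpy as np
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import MinMaxScaler
    from sklearn.svm import SVC

    def fractional_poly_kernel(X, Y, degree=1.5, coef0=1.0):
        """Compute (X @ Y.T + coef0) ** degree with a non-integer degree.

        A fractional exponent requires a nonnegative base, so the inputs are
        scaled to [0, 1] below and coef0 is kept positive.
        """
        return (X @ Y.T + coef0) ** degree

    X, y = load_breast_cancer(return_X_y=True)
    X = MinMaxScaler().fit_transform(X)   # keeps dot products nonnegative
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # SVC accepts a callable kernel, which receives the raw data matrices.
    clf = SVC(kernel=fractional_poly_kernel, C=1.0)
    clf.fit(X_train, y_train)
    print("test accuracy:", clf.score(X_test, y_test))

Note that for a fractional degree the kernel is only well defined where the base x . y + c is nonnegative, which is why the sketch rescales the data; the paper's own treatment of this point is not reproduced here.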

Citation (APA)

Rossius, R., Zenker, G., Ittner, A., & Dilger, W. (1998). A short note about the application of polynomial kernels with fractional degree in support vector learning. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 1398, pp. 143–148). Springer Verlag. https://doi.org/10.1007/bfb0026684
