Maximal Margin Estimation with Perceptron-Like Algorithm

Abstract

In this paper we propose and analyse a γ-margin generalisation of Rosenblatt's perceptron learning algorithm. The only difference between the original approach and the γ-margin approach lies in the update step. We consider the behaviour of the modified algorithm in both the separable and the non-separable case, and also when the γ-margin is negative. We give a convergence proof for the modified algorithm, similar to the classical proof by Novikoff. Moreover, we show how to change the margin of the update step as the algorithm progresses, so as to obtain the maximal possible margin of separation. In the application part, we show the connection between the maximal margin of separation and SVM methods. © 2008 Springer-Verlag Berlin Heidelberg.
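The abstract's central idea, a perceptron whose update fires on margin violations rather than only on outright mistakes, can be sketched as follows. This is a hedged illustration, not the paper's exact procedure: it assumes the update rule triggers whenever the functional margin y_i⟨w, x_i⟩ fails to exceed γ, with γ = 0 recovering Rosenblatt's original algorithm; the function name and stopping criterion are choices made here for illustration.

```python
import numpy as np

def gamma_margin_perceptron(X, y, gamma, max_epochs=1000):
    """Perceptron-like learning with a gamma-margin update step.

    Illustrative sketch: the weight vector is updated whenever a point's
    functional margin y_i * <w, x_i> is at most gamma, not only when the
    point is misclassified. Setting gamma = 0 gives the classical rule.
    """
    w = np.zeros(X.shape[1])
    for _ in range(max_epochs):
        updated = False
        for xi, yi in zip(X, y):
            if yi * np.dot(w, xi) <= gamma:  # margin violation -> update
                w = w + yi * xi
                updated = True
        if not updated:  # every point clears the gamma margin: stop
            break
    return w

# Toy linearly separable data in 2D.
X = np.array([[2.0, 1.0], [1.0, 2.0], [-2.0, -1.0], [-1.0, -2.0]])
y = np.array([1, 1, -1, -1])
w = gamma_margin_perceptron(X, y, gamma=1.0)
print(np.sign(X @ w))  # → [ 1.  1. -1. -1.], matching y
```

In the separable case, demanding a larger γ forces the algorithm to keep updating until the solution separates the data with margin, which is the mechanism the paper exploits (by adjusting γ during the run) to approach the maximal margin of separation.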

Citation (APA)

Korzeń, M., & Klęsk, P. (2008). Maximal Margin Estimation with Perceptron-Like Algorithm. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5097, pp. 597–608). Springer-Verlag. https://doi.org/10.1007/978-3-540-69731-2_58
