The paper considers applying a boosting strategy to optimise the generalisation bound recently obtained by Shawe-Taylor and Cristianini [7] in terms of the 2-norm of the slack variables. The formulation performs gradient descent over a quadratic loss function that is insensitive to points with a large margin. A novel feature of this algorithm is a principled adaptation of the size of the target margin. Experiments with text and UCI data show that the new algorithm improves the accuracy of boosting; DMarginBoost generally achieves significant improvements over AdaBoost.
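The loss described above can be sketched as follows. This is an illustrative reconstruction, not the paper's actual formulation: for a fixed target margin `theta` (which the paper adapts in a principled way), each example incurs a slack `max(0, theta - margin)`, the loss is the squared 2-norm of the slacks, and the gradient with respect to each margin yields the reweighting of examples for the next boosting round. Points whose margin already exceeds `theta` contribute nothing.

```python
import numpy as np

def quad_slack_loss(margins, theta):
    # Squared 2-norm of the slack variables: points with margin >= theta
    # incur zero slack, so the loss is insensitive to large-margin points.
    slack = np.maximum(0.0, theta - margins)
    return float(np.sum(slack ** 2))

def reweight(margins, theta):
    # The derivative of the loss w.r.t. each margin is proportional to the
    # slack, which (normalised) gives the example weights for the next round.
    w = np.maximum(0.0, theta - margins)
    total = w.sum()
    if total == 0.0:
        return np.full_like(w, 1.0 / len(w))
    return w / total

# Toy usage: margins y_i * F(x_i) for three examples, target margin 1.0.
margins = np.array([-0.5, 0.2, 1.5])
loss = quad_slack_loss(margins, 1.0)    # slacks are [1.5, 0.8, 0.0]
weights = reweight(margins, 1.0)        # third point gets zero weight
```

Note that, unlike AdaBoost's exponential reweighting, confidently classified points drop out of the weight distribution entirely rather than merely receiving exponentially small weight.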
CITATION STYLE
Lodhi, H., Karakoulas, G., & Shawe-Taylor, J. (2000). Boosting the margin distribution. In Lecture Notes in Computer Science (Vol. 1983, pp. 54–59). Springer. https://doi.org/10.1007/3-540-44491-2_9