Adaboosting neural networks: Application to on-line character recognition

Abstract

“Boosting” is a general method for improving the performance of any weak learning algorithm that consistently generates classifiers which need to perform only slightly better than random guessing. A recently proposed and very promising boosting algorithm is AdaBoost [4]. It has been applied with great success to several benchmark machine learning problems using rather simple learning algorithms [3], in particular decision trees [1,2,5]. In this paper we use AdaBoost to improve the performance of a strong learning algorithm: a neural-network-based on-line character recognition system. In particular, we show that it can be used to automatically learn a great variety of writing styles, even when the amount of training data available for each style varies considerably. Our system achieves an error rate of about 1.4% on a handwritten digit database written by more than 200 writers.
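For readers unfamiliar with the procedure the abstract refers to, the following is a minimal sketch of generic binary discrete AdaBoost in Python/NumPy. It is not the authors' exact method: the paper applies a multi-class variant with neural networks as base learners, whereas this sketch uses hypothetical decision stumps (train_stump, adaboost, and predict are illustrative names) and assumes labels in {-1, +1}. The core of the method is the example reweighting and the per-round classifier weight alpha.

```python
import numpy as np

def train_stump(X, y, w):
    """Fit a one-feature threshold classifier (decision stump) minimizing weighted error."""
    n, d = X.shape
    best = (np.inf, 0, 0.0, 1)  # (error, feature index, threshold, polarity)
    for j in range(d):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
                err = np.sum(w[pred != y])
                if err < best[0]:
                    best = (err, j, thr, pol)
    return best

def adaboost(X, y, n_rounds=50):
    """Discrete AdaBoost with decision stumps; y must contain labels in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)      # example weights, updated every round
    ensemble = []                # list of (alpha, feature, threshold, polarity)
    for _ in range(n_rounds):
        err, j, thr, pol = train_stump(X, y, w)
        err = max(err, 1e-10)                      # guard against division by zero
        alpha = 0.5 * np.log((1 - err) / err)      # weight of this weak classifier
        pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
        w *= np.exp(-alpha * y * pred)             # up-weight misclassified examples
        w /= w.sum()
        ensemble.append((alpha, j, thr, pol))
    return ensemble

def predict(ensemble, X):
    """Final decision: sign of the alpha-weighted vote of all weak classifiers."""
    score = np.zeros(len(X))
    for alpha, j, thr, pol in ensemble:
        score += alpha * np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
    return np.sign(score)
```

In the paper's setting, the decision stump returned by train_stump would be replaced by a neural network trained on the reweighted (or resampled) examples, and the weighted vote would combine the networks' outputs.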

Citation (APA)

Schwenk, H., & Bengio, Y. (1997). Adaboosting neural networks: Application to on-line character recognition. In Lecture Notes in Computer Science (Vol. 1327, pp. 967–972). Springer. https://doi.org/10.1007/bfb0020278
