Adaptive model selection can be defined as the process by which an optimal classifier h* is automatically selected from a function class H using only a given set of examples z. This process is particularly critical when the number of examples in z is small, because the classical splitting of z into training, validation, and test sets becomes infeasible. In this work we show that the joint investigation of two bounds on the prediction error of the classifier can be used to select h*, employing z for both model selection and training. Our learning algorithm is a simple kernel-based Perceptron that can be easily implemented in counter-based digital hardware. Experiments on two real-world data sets show the validity of the proposed method. © Springer-Verlag Berlin Heidelberg 2002.
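The abstract does not give the algorithm's details, but a dual-form kernel Perceptron of the kind it alludes to can be sketched as follows. The dual coefficients are integer mistake counters, which is what makes a counter-based hardware implementation natural; the RBF kernel, the variable names, and all hyperparameters here are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def rbf_kernel(x1, x2, gamma=1.0):
    # Gaussian (RBF) kernel; gamma is an assumed hyperparameter.
    return np.exp(-gamma * np.sum((x1 - x2) ** 2))

def train_kernel_perceptron(X, y, kernel, epochs=10):
    """Dual-form Perceptron: alpha[i] is an integer counting
    the mistakes made on example i (hence counter-based)."""
    n = len(X)
    alpha = np.zeros(n, dtype=int)
    K = np.array([[kernel(X[i], X[j]) for j in range(n)] for i in range(n)])
    for _ in range(epochs):
        mistakes = 0
        for i in range(n):
            # Prediction is a sign of a kernel expansion over all examples.
            pred = np.sign(np.sum(alpha * y * K[:, i]))
            if pred != y[i]:
                alpha[i] += 1  # only an integer increment on a mistake
                mistakes += 1
        if mistakes == 0:
            break  # converged: training set is separated in feature space
    return alpha

def predict(X_train, y_train, alpha, kernel, x):
    s = sum(a * yi * kernel(xi, x)
            for a, yi, xi in zip(alpha, y_train, X_train))
    return 1 if s >= 0 else -1
```

Because the only learned state is a vector of integer counters, the hypothesis can be stored and updated without floating-point arithmetic in the learning loop itself, consistent with the digital-hardware motivation in the abstract.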
CITATION STYLE
Boni, A. (2002). Adaptive model selection for digital linear classifiers. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 2415 LNCS, pp. 1333–1338). Springer Verlag. https://doi.org/10.1007/3-540-46084-5_215