Multi-class multi-scale stacked sequential learning

Abstract

A common assumption in supervised learning is that the data are independent and identically distributed. However, this assumption does not hold in many real cases. Sequential learning is the discipline of machine learning that deals with dependent data. In this paper, we revise the Multi-Scale Stacked Sequential Learning approach (MSSL) to apply it to the multi-class case (MMSSL). We introduce the ECOC framework into the MSSL base classifiers, together with a formulation for computing confidence maps from the margins of the base classifiers. Another important contribution of this paper is the MMSSL compression approach for reducing the number of features in the extended data set. The proposed methods are tested on 5-class and 9-class image databases. © 2011 Springer-Verlag.
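The pipeline sketched in the abstract (ECOC base classifiers, confidence maps from margins, multi-scale neighbourhood features, and a second stacked classifier on the extended data set) can be illustrated with a minimal example. The sketch below assumes scikit-learn, NumPy and SciPy; the sigmoid used to turn margins into confidences, the Gaussian scales, the auxiliary one-vs-all scorer, and all function names are illustrative assumptions, not the authors' exact formulation (the compression step is omitted).

# Minimal sketch of multi-class multi-scale stacked sequential learning on image pixels.
# Assumed details: sigmoid margin-to-confidence mapping, Gaussian multi-scale sampling,
# and an auxiliary per-class scorer standing in for the ECOC margin decoding.
import numpy as np
from scipy.ndimage import gaussian_filter
from sklearn.multiclass import OutputCodeClassifier   # ECOC framework wrapper
from sklearn.svm import LinearSVC

def margins_to_confidence(margins):
    """Squash real-valued classifier margins into [0, 1] confidences."""
    return 1.0 / (1.0 + np.exp(-margins))

def multiscale_features(conf_map, scales=(1, 2, 4)):
    """Smooth each per-class confidence map at several Gaussian scales and
    stack the responses as extra per-pixel features (the multi-scale step)."""
    h, w, n_classes = conf_map.shape
    feats = [gaussian_filter(conf_map[..., c], sigma=s)
             for s in scales for c in range(n_classes)]
    return np.stack(feats, axis=-1).reshape(h * w, -1)

# --- usage sketch on a single synthetic labelled image ----------------------
# X_pix: (H*W, d) per-pixel appearance features, y_pix: (H*W,) labels in {0..K-1}
H, W, d, K = 32, 32, 8, 5
rng = np.random.default_rng(0)
X_pix = rng.normal(size=(H * W, d))
y_pix = rng.integers(0, K, size=H * W)

# 1) first-stage multi-class base classifier via the ECOC framework
base = OutputCodeClassifier(LinearSVC(), code_size=2, random_state=0).fit(X_pix, y_pix)

# 2) confidence maps: here approximated from one-vs-all decision margins
#    of an auxiliary scorer (the ECOC margin decoding itself is not reproduced)
aux = LinearSVC().fit(X_pix, y_pix)
conf = margins_to_confidence(aux.decision_function(X_pix)).reshape(H, W, K)

# 3) extended data set = original features + multi-scale neighbourhood confidences
X_ext = np.hstack([X_pix, multiscale_features(conf)])

# 4) second-stage (stacked) classifier trained on the extended data set
stacked = OutputCodeClassifier(LinearSVC(), code_size=2, random_state=0).fit(X_ext, y_pix)
print("extended feature dimension:", X_ext.shape[1])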

Citation (APA)

Puertas, E., Escalera, S., & Pujol, O. (2011). Multi-class multi-scale stacked sequential learning. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6713 LNCS, pp. 197–206). https://doi.org/10.1007/978-3-642-21557-5_22
