Multi-source transfer learning with multi-view adaboost

Abstract

Transfer learning, one of the most important research directions in machine learning, has been studied in various fields in recent years. In this paper, we combine multi-source and multi-view learning with transfer learning and propose a new algorithm named Multi-source Transfer Learning with Multi-view Adaboost (MsTL-MvAdaboost). Unlike many previous works on transfer learning, the algorithm not only uses labeled data from several source tasks to help learn one target task, but also considers how to transfer this knowledge across different views synchronously. We regard every source and target task as a collection of several constituent views, so that each task can be learned from its different views. Experimental results validate the effectiveness of the proposed approach. © 2012 Springer-Verlag.
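The abstract only outlines the idea, so the sketch below is an illustration, not the authors' published MsTL-MvAdaboost update rules: it pools labeled source and target rows per view, trains one weak learner per view in each boosting round, and reweights instances AdaBoost-style, with a target-only error estimate and TrAdaBoost-style down-weighting of misclassified source rows. The function names (ms_mv_boost, predict) and these specific weighting choices are assumptions made for the example.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def ms_mv_boost(sources, target, n_rounds=10):
    """Illustrative multi-source, multi-view boosting sketch (not the paper's exact rules).

    sources: list of (views, y) pairs, one per source task; `views` is a list of
             feature matrices (one per view, same rows), `y` has labels in {-1, +1}.
    target:  (views, y) pair for the small labeled target set, same view layout
             (view v must have the same number of columns across all tasks).
    Returns a list of (view_index, weak_learner, alpha) triples.
    """
    tgt_views, tgt_y = target
    n_views = len(tgt_views)

    # Pool source and target rows per view; remember which rows belong to the target task.
    pooled = [np.vstack([src_views[v] for src_views, _ in sources] + [tgt_views[v]])
              for v in range(n_views)]
    y = np.concatenate([src_y for _, src_y in sources] + [tgt_y])
    is_target = np.zeros(len(y), dtype=bool)
    is_target[-len(tgt_y):] = True

    w = np.full(len(y), 1.0 / len(y))  # instance weights over all pooled rows
    ensemble = []

    for _ in range(n_rounds):
        for v in range(n_views):  # one weak learner per view per round
            clf = DecisionTreeClassifier(max_depth=1)
            clf.fit(pooled[v], y, sample_weight=w)
            miss = clf.predict(pooled[v]) != y

            # Error measured on target rows only (a modelling assumption of this sketch).
            err = w[is_target & miss].sum() / max(w[is_target].sum(), 1e-12)
            err = np.clip(err, 1e-6, 0.499)
            alpha = 0.5 * np.log((1.0 - err) / err)
            ensemble.append((v, clf, alpha))

            # Increase weights of misclassified target rows (AdaBoost-style) and
            # decrease weights of misclassified source rows so unhelpful source
            # data fades out (TrAdaBoost-style).
            w[is_target] *= np.exp(alpha * miss[is_target])
            w[~is_target] *= np.exp(-alpha * miss[~is_target])
            w /= w.sum()

    return ensemble

def predict(ensemble, views):
    """Weighted vote of all view-specific weak learners; labels are in {-1, +1}."""
    score = np.zeros(len(views[0]))
    for v, clf, alpha in ensemble:
        score += alpha * clf.predict(views[v])
    return np.sign(score)
```

The design point this sketch tries to reflect is the one stated in the abstract: knowledge from several labeled source tasks is transferred while every view of every task contributes its own weak learner in each round, so the views are exploited synchronously rather than concatenated into a single feature space.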

Citation (APA)

Xu, Z., & Sun, S. (2012). Multi-source transfer learning with multi-view adaboost. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 7665 LNCS, pp. 332–339). https://doi.org/10.1007/978-3-642-34487-9_41
