Deep transfer learning via minimum enclosing balls

Abstract

Training deep learning models such as CNNs, LSTMs, or GRUs often requires large amounts of data. However, in real-world applications the amount of data, especially labelled data, is limited. To address this challenge, we study Deep Transfer Learning (DTL) in the context of Multitask Learning (MTL) to extract sharable knowledge from tasks and use it for related tasks. In this paper, we use the Minimum Enclosing Ball (MEB) as a flexible knowledge representation to map shared domain knowledge from a primary task to a secondary task in multitask learning. The experiments provide both analytic and empirical results showing the effectiveness and robustness of the proposed MEB-based deep transfer learning.
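For readers unfamiliar with the geometric object the paper builds on, the sketch below illustrates one standard way to approximate a minimum enclosing ball: the Bădoiu–Clarkson core-set iteration. This is a generic illustration of the MEB concept, not the paper's actual DTL method; the function name and iteration count are our own choices.

```python
import numpy as np

def minimum_enclosing_ball(points, iterations=200):
    """Approximate the minimum enclosing ball of a point set
    with the Badoiu-Clarkson iteration (illustrative sketch)."""
    points = np.asarray(points, dtype=float)
    center = points[0].copy()
    for i in range(1, iterations + 1):
        # Find the point farthest from the current center...
        dists = np.linalg.norm(points - center, axis=1)
        farthest = points[np.argmax(dists)]
        # ...and step toward it with a shrinking step size,
        # which yields a (1 + eps)-approximation of the MEB.
        center += (farthest - center) / (i + 1)
    radius = np.linalg.norm(points - center, axis=1).max()
    return center, radius

# Example: corners of the unit square are enclosed by a ball
# centered near (0.5, 0.5) with radius near sqrt(0.5).
c, r = minimum_enclosing_ball([[0, 0], [0, 1], [1, 0], [1, 1]])
```

In the paper's setting, such a ball learned on a primary task serves as a compact, transferable summary of that task's domain.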

Citation (APA)

Deng, Z., Liu, F., Zhao, J., Wei, Q., Pang, S., & Leng, Y. (2018). Deep transfer learning via minimum enclosing balls. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11303 LNCS, pp. 198–207). Springer Verlag. https://doi.org/10.1007/978-3-030-04182-3_18
