Training deep learning models such as CNNs, LSTMs, or GRUs typically requires large amounts of data. In real-world applications, however, data, and especially labelled data, is often limited. To address this challenge, we study Deep Transfer Learning (DTL) in the context of Multi-task Learning (MTL), extracting sharable knowledge from one task and applying it to related tasks. In this paper, we use the Minimum Enclosing Ball (MEB) as a flexible knowledge representation for mapping shared domain knowledge from a primary task to a secondary task in multi-task learning. The experiments provide both analytic and empirical results showing the effectiveness and robustness of the proposed MEB-based deep transfer learning.
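As a concrete illustration of the MEB concept used above, the sketch below approximates a minimum enclosing ball of a point set with the Bădoiu–Clarkson core-set iteration. This is a generic MEB approximation, not the paper's method; the function name and iteration count are illustrative assumptions.

```python
import numpy as np

def minimum_enclosing_ball(points, iterations=200):
    """Approximate the minimum enclosing ball (MEB) of a point set
    via the Badoiu-Clarkson scheme: repeatedly move the center a
    shrinking step toward the farthest point (illustrative sketch)."""
    points = np.asarray(points, dtype=float)
    center = points[0].copy()
    for i in range(1, iterations + 1):
        # Find the point farthest from the current center.
        dists = np.linalg.norm(points - center, axis=1)
        farthest = points[np.argmax(dists)]
        # Shrinking step size yields a (1 + 1/iterations)-approximation.
        center += (farthest - center) / (i + 1)
    radius = np.linalg.norm(points - center, axis=1).max()
    return center, radius

# Example: the MEB of a unit square is centered at (0.5, 0.5).
center, radius = minimum_enclosing_ball([[0, 0], [0, 1], [1, 0], [1, 1]])
```

In MEB-based transfer learning, such a ball summarizes a task's data or feature distribution compactly (a center and a radius), which is what makes it attractive as a sharable knowledge representation between tasks.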
Citation:
Deng, Z., Liu, F., Zhao, J., Wei, Q., Pang, S., & Leng, Y. (2018). Deep transfer learning via minimum enclosing balls. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11303 LNCS, pp. 198–207). Springer Verlag. https://doi.org/10.1007/978-3-030-04182-3_18