Improvement of heterogeneous transfer learning efficiency by using Hebbian learning principle

Citations: 16
Mendeley readers: 34

Abstract

Transfer learning algorithms have been widely studied in machine learning in recent years. In image recognition and classification tasks in particular, transfer learning has shown significant benefits and is receiving considerable attention in the research community. When transferring knowledge between source and target tasks, a homogeneous dataset is not always available, and a heterogeneous dataset may be chosen in certain circumstances. In this article, we propose a way of improving transfer learning efficiency in the case of a heterogeneous source and target by using the Hebbian learning principle, called Hebbian transfer learning (HTL). In computer vision, biologically motivated approaches such as Hebbian learning represent associative learning, in which simultaneous activation of brain cells increases the strength of the synaptic connection between the individual cells. The discriminative nature of learning in the search for features for image classification fits well with techniques such as the Hebbian learning rule: neurons that fire together wire together. Deep learning models such as convolutional neural networks (CNNs) are widely used for image classification. In transfer learning, the connection weights of such a learned model should adapt to the new target dataset with minimum effort. A discriminative learning rule such as Hebbian learning can improve learning performance by quickly adapting to discriminate between the different classes defined by the target task. We apply the Hebbian principle as synaptic plasticity in transfer learning for image classification using a heterogeneous source-target dataset, and compare results with the standard transfer learning case.
Experimental results using CIFAR-10 (Canadian Institute for Advanced Research) and CIFAR-100 datasets with various combinations show that the proposed HTL algorithm can improve the performance of transfer learning, especially in the case of a heterogeneous source and target dataset.
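The Hebbian principle summarized in the abstract can be illustrated with a minimal weight update. The sketch below is an assumption-laden toy, not the authors' exact HTL algorithm: the function name, array shapes, and learning rate are illustrative, and a transferred final layer is stood in for by a plain weight matrix.

```python
import numpy as np

# Classic Hebbian update ("neurons that fire together wire together"):
# w_ij += eta * pre_i * post_j. Weights strengthen in proportion to the
# co-activation of pre- and post-synaptic units.
def hebbian_update(w, pre, post, eta=0.01):
    return w + eta * np.outer(pre, post)

rng = np.random.default_rng(0)

# Toy stand-in for a transferred final layer: pretrained weights `w`
# adapt to target-task activations via the Hebbian rule.
w = rng.normal(scale=0.1, size=(8, 3))   # 8 features -> 3 target classes
pre = rng.random(8)                      # non-negative feature activations
post = np.zeros(3)
post[1] = 1.0                            # indicator of the active target class

w_new = hebbian_update(w, pre, post)
# Only the column for the active class is strengthened;
# the other columns are untouched because their post-activation is zero.
```

Note how the update is purely local (it depends only on the two activations at each synapse), which is what lets it adapt quickly to a new target task without backpropagated gradients.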




CITATION STYLE

APA

Magotra, A., & Kim, J. (2020). Improvement of heterogeneous transfer learning efficiency by using Hebbian learning principle. Applied Sciences (Switzerland), 10(16). https://doi.org/10.3390/app10165631


Readers' Seniority

PhD / Post grad / Masters / Doc: 10 (67%)
Lecturer / Post doc: 2 (13%)
Researcher: 2 (13%)
Professor / Associate Prof.: 1 (7%)

Readers' Discipline

Computer Science: 8 (44%)
Engineering: 7 (39%)
Neuroscience: 2 (11%)
Nursing and Health Professions: 1 (6%)

Article Metrics

Blog Mentions: 1
