Improving optimization of convolutional neural networks through parameter fine-tuning

Citations: 45 · Mendeley readers: 81

This article is free to access.

Abstract

In recent years, convolutional neural networks have achieved state-of-the-art performance on a number of computer vision problems such as image classification. Prior research has shown that a transfer learning technique known as parameter fine-tuning, in which a network is first pre-trained on a different dataset, can boost the performance of these networks. However, identifying the best source dataset and learning strategy for a given target domain remains largely unexplored. This work therefore presents and evaluates several transfer learning methods for fine-grained image classification, as well as their effect on ensemble networks. The results clearly demonstrate the effectiveness of parameter fine-tuning over random initialization. We find that training should not be reduced after transferring weights; that larger, more similar networks tend to make the best source task; and that parameter fine-tuning can often outperform randomly initialized ensembles. The experimental framework and findings will help practitioners train models with improved accuracy.
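Since only the abstract is available here, the following is a minimal sketch of the kind of parameter fine-tuning setup it describes. The choice of backbone (ResNet-18), source weights (ImageNet), target task size, and optimizer are illustrative assumptions, not the paper's exact configuration; the key point is that transferred weights replace random initialization while the full network remains trainable on the target dataset.

```python
# Illustrative sketch of parameter fine-tuning vs. random initialization
# (assumes PyTorch and torchvision >= 0.13; not the paper's exact setup).
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 200  # hypothetical fine-grained target task, e.g. 200 categories

def build_model(transfer: bool) -> nn.Module:
    """Build a ResNet-18 whose backbone is either transferred from ImageNet
    (parameter fine-tuning) or randomly initialized (the baseline)."""
    weights = models.ResNet18_Weights.IMAGENET1K_V1 if transfer else None
    model = models.resnet18(weights=weights)
    # Swap the source-task classifier for a fresh head on the target classes;
    # every other parameter keeps its transferred (or random) value and
    # remains trainable, so training on the target task is not reduced.
    model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)
    return model

def train_step(model: nn.Module, optimizer: torch.optim.Optimizer,
               images: torch.Tensor, labels: torch.Tensor) -> float:
    """One supervised step on the target dataset; identical for both models,
    only the initialization differs."""
    optimizer.zero_grad()
    loss = nn.functional.cross_entropy(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    finetuned = build_model(transfer=True)   # pre-trained source weights
    baseline = build_model(transfer=False)   # random initialization
    opt = torch.optim.Adam(finetuned.parameters(), lr=1e-4)
    # Dummy batch to show the call; real experiments iterate over the
    # target dataset for the full training budget.
    x = torch.randn(8, 3, 224, 224)
    y = torch.randint(0, NUM_CLASSES, (8,))
    print(train_step(finetuned, opt, x, y))
```

Training both models with the same schedule and comparing target-task accuracy reproduces, in miniature, the abstract's comparison of fine-tuned versus randomly initialized networks.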




Citation (APA)

Becherer, N., Pecarina, J., Nykl, S., & Hopkinson, K. (2019). Improving optimization of convolutional neural networks through parameter fine-tuning. Neural Computing and Applications, 31(8), 3469–3479. https://doi.org/10.1007/s00521-017-3285-0

Readers over time: Mendeley reader counts per year, 2017–2025.

Readers' Seniority

PhD / Post grad / Masters / Doc: 32 (71%)
Professor / Associate Prof.: 5 (11%)
Lecturer / Post doc: 4 (9%)
Researcher: 4 (9%)

Readers' Discipline

Computer Science: 29 (60%)
Engineering: 17 (35%)
Neuroscience: 1 (2%)
Sports and Recreations: 1 (2%)
