Effects of data count and image scaling on Deep Learning training

Abstract

Background. Deep learning using convolutional neural networks (CNNs) has achieved significant results in many fields that use images. Deep learning can automatically extract features from data, and a CNN extracts image features by convolution processing. We assumed that enlarging images with interpolation methods would lead to more effective feature extraction. To investigate how the effect of interpolation methods changes as the amount of data increases, we examined and compared the effectiveness of data augmentation by inversion or rotation with image enlargement by interpolation when the amount of training image data was small. We also clarified whether image enlargement by interpolation is useful for CNN training. To examine the usefulness of interpolation methods for medical images, we used the Gender01 data set, a sex classification data set of chest radiographs. To compare image enlargement by interpolation with data augmentation by inversion and rotation, we examined the results of two- and four-fold enlargement using the bilinear method.
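
To illustrate the two preprocessing strategies compared in the abstract, the sketch below shows bilinear two- and four-fold image enlargement alongside simple flip/rotation data augmentation. This is a minimal sketch using Pillow, not the authors' implementation; the file path is a placeholder, and the reading of "inversion" as a horizontal flip is an assumption.

# Minimal sketch, not the authors' code. Assumes Pillow is installed and that
# "inversion" corresponds to a horizontal flip; the input path is hypothetical.
from PIL import Image

def enlarge_bilinear(img, factor):
    """Enlarge an image by an integer factor using bilinear interpolation."""
    w, h = img.size
    return img.resize((w * factor, h * factor), Image.BILINEAR)

def augment_flip_rotate(img):
    """Return simple flip/rotation variants used as data augmentation."""
    return [
        img.transpose(Image.FLIP_LEFT_RIGHT),  # horizontal flip ("inversion")
        img.rotate(90, expand=True),           # 90-degree rotation
        img.rotate(270, expand=True),          # 270-degree rotation
    ]

if __name__ == "__main__":
    # "chest_xray.png" is a placeholder, not a file from the study.
    img = Image.open("chest_xray.png").convert("L")  # grayscale radiograph
    enlarged_2x = enlarge_bilinear(img, 2)   # two-fold enlargement
    enlarged_4x = enlarge_bilinear(img, 4)   # four-fold enlargement
    augmented = augment_flip_rotate(img)     # flip/rotation augmentation

In an experiment like the one described, the enlarged images and the augmented images would be fed to the same CNN training setup so that the two strategies can be compared under a fixed amount of original data.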

Citation

Hirahara, D., Takaya, E., Takahara, T., & Ueda, T. (2020). Effects of data count and image scaling on Deep Learning training. PeerJ Computer Science, 6, 1–13. https://doi.org/10.7717/peerj-cs.312
