SSCLNet: A Self-Supervised Contrastive Loss-Based Pre-Trained Network for Brain MRI Classification

Abstract

Brain magnetic resonance images (MRI) convey vital information for making diagnostic decisions and are widely used to detect brain tumors. This research proposes a self-supervised pre-training method based on feature representation learning through a contrastive loss applied to unlabeled data. Self-supervised learning aims to learn salient features from the raw input alone, which is valuable because labeled data is scarce and expensive. For contrastive pre-training, data augmentation is applied to the dataset, and positive and negative instance pairs are fed into a deep learning model for feature learning. The resulting features are then passed through a neural network head trained to maximize the similarity of positive pairs while contrasting them against negative pairs. The pre-trained model serves as an encoder for supervised training and subsequent classification of MRI images. Our results show that self-supervised pre-training with contrastive loss performs better than random or ImageNet initialization. We also show that contrastive learning performs better when the pre-training dataset contains a greater diversity of images. We use three ResNet models of different sizes as base models. Further experiments study the effect of varying the augmentation types used to generate positive and negative samples for self-supervised training.
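The abstract describes a SimCLR-style pipeline: two augmented views of each unlabeled scan form a positive pair, other samples in the batch act as negatives, and a ResNet encoder plus projection head is trained with a contrastive loss. Below is a minimal sketch of that idea, assuming PyTorch/torchvision, a ResNet-18 backbone, and the NT-Xent loss; the class name SSCLEncoder, the projection-head sizes, and the temperature value are illustrative assumptions, not details taken from the paper.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F
    import torchvision

    class SSCLEncoder(nn.Module):
        """ResNet backbone + small projection head (hypothetical sizes)."""
        def __init__(self, proj_dim=128):
            super().__init__()
            backbone = torchvision.models.resnet18(weights=None)
            feat_dim = backbone.fc.in_features
            backbone.fc = nn.Identity()       # drop the classification head
            self.backbone = backbone
            self.projector = nn.Sequential(   # MLP head used only in pre-training
                nn.Linear(feat_dim, feat_dim),
                nn.ReLU(inplace=True),
                nn.Linear(feat_dim, proj_dim),
            )

        def forward(self, x):
            h = self.backbone(x)              # representation kept for fine-tuning
            z = F.normalize(self.projector(h), dim=1)  # embedding fed to the loss
            return z

    def nt_xent_loss(z1, z2, temperature=0.5):
        """NT-Xent: two augmented views of the same scan are positives;
        every other sample in the batch serves as a negative."""
        n = z1.size(0)
        z = torch.cat([z1, z2], dim=0)                 # (2n, d), L2-normalized
        sim = z @ z.t() / temperature                  # cosine similarities
        mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
        sim = sim.masked_fill(mask, float('-inf'))     # exclude self-similarity
        targets = torch.cat([torch.arange(n, 2 * n),   # view i matches view i+n
                             torch.arange(0, n)]).to(z.device)
        return F.cross_entropy(sim, targets)

    if __name__ == "__main__":
        # x1 and x2 stand in for two random augmentations of the same MRI batch;
        # grayscale scans are commonly replicated to three channels for ResNet.
        model = SSCLEncoder()
        x1, x2 = torch.randn(8, 3, 224, 224), torch.randn(8, 3, 224, 224)
        loss = nt_xent_loss(model(x1), model(x2))
        print(f"contrastive loss: {loss.item():.4f}")

After pre-training, the projection head would be discarded and the backbone reused as the encoder for supervised classification, matching the fine-tuning step the abstract describes.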

Citation (APA)
Mishra, A., Jha, R., & Bhattacharjee, V. (2023). SSCLNet: A Self-Supervised Contrastive Loss-Based Pre-Trained Network for Brain MRI Classification. IEEE Access, 11, 6673–6681. https://doi.org/10.1109/ACCESS.2023.3237542
