Performance-Efficiency Comparisons of Channel Attention Modules for ResNets


This article is free to access.

Abstract

Attention modules can be added to neural network architectures to improve performance. This work presents an extensive comparison between several efficient attention modules for image classification and object detection, in addition to proposing a novel Attention Bias module with lower computational overhead. All measured attention modules have been efficiently re-implemented, which allows an objective comparison and evaluation of the relationship between accuracy and inference time. Our measurements show that single-image inference time increases far more (5–50%) than the increase in FLOPs suggests (0.2–3%) for a limited gain in accuracy, making computation cost an important selection criterion. Despite this increase in inference time, adding an attention module can outperform a deeper baseline ResNet in both speed and accuracy. Finally, we investigate the potential of adding attention modules to pretrained networks and show that fine-tuning is possible and superior to training from scratch. The choice of the best attention module strongly depends on the specific ResNet architecture, input resolution, batch size and inference framework.
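The channel attention modules compared in the abstract follow the squeeze-and-excitation pattern: globally pool each channel, pass the pooled vector through a small bottleneck MLP, and use a sigmoid gate to reweight the channels. A minimal NumPy sketch of that pattern (illustrative only; the function name, weight shapes, and reduction ratio are assumptions, not the paper's implementation):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def se_channel_attention(x, w1, w2):
    """Squeeze-and-Excitation style channel attention (illustrative sketch).

    x:  feature map of shape (C, H, W)
    w1: (C // r, C) reduction weights (r = reduction ratio)
    w2: (C, C // r) expansion weights
    Returns the channel-reweighted feature map, same shape as x.
    """
    # Squeeze: global average pooling over the spatial dims -> (C,)
    z = x.mean(axis=(1, 2))
    # Excitation: bottleneck MLP with ReLU, then a sigmoid gate in (0, 1)
    s = sigmoid(w2 @ np.maximum(w1 @ z, 0.0))
    # Scale: broadcast the per-channel gate over H and W
    return x * s[:, None, None]
```

Because the gate lies in (0, 1), the module can only attenuate channels, and its extra FLOPs are tiny relative to the convolutions — which is why, as the abstract notes, measured inference time (dominated by the added memory traffic and kernel launches) grows far more than the FLOP count suggests.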


Citation (APA)

Klomp, S. R., Wijnhoven, R. G. J., & de With, P. H. N. (2023). Performance-Efficiency Comparisons of Channel Attention Modules for ResNets. Neural Processing Letters, 55(5), 6797–6813. https://doi.org/10.1007/s11063-023-11161-z
