Optimizing Medical Image Classification Models for Edge Devices

Abstract

Machine learning models for medical diagnostics often require resource-intensive environments, such as expensive cloud servers or high-end GPUs, making them impractical for use in the field. We investigate model quantization and GPU acceleration for chest X-ray classification on edge devices. We apply three types of post-training quantization (dynamic range, float-16, and full int8) to models trained on the ChestX-ray14 dataset. Quantization reduced model size by 2–4x, offset by small decreases of 0.0%–0.9% in mean AUC-ROC. On ARM architectures, integer quantization improved inference latency by up to 57%; on x86 processors, however, we observed significant increases in latency. GPU acceleration also reduced inference latency, but these gains were outweighed by kernel launch overhead. We show that optimizing diagnostic models can expand their utility to the day-to-day devices used by patients and healthcare workers; however, the improvements are context- and architecture-dependent and should be validated on the target devices before deployment in low-resource environments.
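
The abstract does not name the optimization toolchain, but the three quantization modes it lists (dynamic range, float-16, full int8) correspond to TensorFlow Lite's post-training quantization options. As a minimal sketch only, assuming a TensorFlow SavedModel export and TFLite conversion (an assumption, not stated in the abstract), the three variants could be produced roughly as follows; `saved_model_dir`, the input shape, and the calibration data are hypothetical placeholders.

```python
import numpy as np
import tensorflow as tf

# Hypothetical path to the exported chest X-ray classifier.
saved_model_dir = "chest_xray_model/"

def representative_data_gen():
    # Calibration generator required for full int8 conversion. Dummy data is
    # used here as a stand-in for a small set of preprocessed X-ray images;
    # the 224x224x3 input shape is an assumption.
    for _ in range(100):
        yield [np.random.rand(1, 224, 224, 3).astype(np.float32)]

# 1) Dynamic range quantization: weights stored as int8, activations in float.
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
dynamic_range_model = converter.convert()

# 2) Float-16 quantization: weights stored as float16.
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_types = [tf.float16]
float16_model = converter.convert()

# 3) Full int8 quantization: weights and activations quantized, calibrated
#    against the representative dataset above.
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data_gen
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8
int8_model = converter.convert()

with open("chest_xray_int8.tflite", "wb") as f:
    f.write(int8_model)
```

Full int8 conversion is the variant most relevant to the ARM latency results, since integer kernels can exploit fixed-point instructions on those processors; the representative dataset is what lets the converter estimate activation ranges ahead of time.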

Citation (APA)

Abid, A., Sinha, P., Harpale, A., Gichoya, J., & Purkayastha, S. (2022). Optimizing Medical Image Classification Models for Edge Devices. In Lecture Notes in Networks and Systems (Vol. 327 LNNS, pp. 77–87). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-86261-9_8
