Interpolation-based detection of lumbar vertebrae in CT spine images


Abstract

Detection of an object of interest can be represented as an optimization problem that can be solved by brute force or heuristic algorithms. However, the globally optimal solution may not represent the optimal detection result, which can be especially observed in the case of vertebra detection, where neighboring vertebrae are of similar appearance and shape. An adequate optimizer therefore has to consider not only the global optimum but also local optima that represent candidate locations for each vertebra. In this paper, we describe a novel framework for automated spine and vertebra detection in three-dimensional (3D) images of the lumbar spine, where we apply a novel optimization technique based on interpolation theory to detect the location of the whole spine in the 3D image and to detect the location of individual vertebrae within the spinal column. The performance of the proposed framework was evaluated on 10 computed tomography (CT) images of the lumbar spine. The resulting mean symmetric absolute surface distance of 1.25 ± 0.41 mm and Dice coefficient of 83.67 ± 4.44%, computed from the final vertebra detection results against corresponding reference vertebra segmentations, indicate that the proposed framework can successfully detect vertebrae in CT images of the lumbar spine.
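The core idea of keeping local optima as vertebra candidates, rather than committing to the single global optimum, can be illustrated with a minimal sketch (this is not the authors' implementation; the 1D "appearance score" profile and the candidate-selection heuristic are stand-ins for the paper's full 3D interpolation-based optimization):

```python
def local_optima(scores, min_separation=2):
    """Return indices of local maxima that are at least `min_separation` apart.

    Keeping all sufficiently strong local maxima as candidates guards
    against the failure mode described in the abstract: when neighboring
    vertebrae look alike, the global optimum alone may land on the
    wrong vertebra.
    """
    # A point is a local maximum if it beats its left neighbor and is
    # at least as good as its right neighbor.
    candidates = [
        i for i in range(1, len(scores) - 1)
        if scores[i] > scores[i - 1] and scores[i] >= scores[i + 1]
    ]
    # Greedily keep the strongest maxima, suppressing close duplicates.
    candidates.sort(key=lambda i: scores[i], reverse=True)
    kept = []
    for i in candidates:
        if all(abs(i - j) >= min_separation for j in kept):
            kept.append(i)
    return sorted(kept)

# Two similar peaks: taking only the global optimum (index 8) would
# discard the nearly-as-good candidate at index 3.
profile = [0.1, 0.3, 0.7, 0.9, 0.6, 0.2, 0.5, 0.8, 0.95, 0.4]
print(local_optima(profile))  # → [3, 8]
```

In the actual framework, each retained candidate location would then be scored jointly with its neighbors along the spinal column, so that inter-vertebral spacing can disambiguate similarly-looking vertebrae.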

Citation (APA)

Ibragimov, B., Korez, R., Likar, B., Pernuš, F., & Vrtovec, T. (2015). Interpolation-based detection of lumbar vertebrae in CT spine images. Lecture Notes in Computational Vision and Biomechanics, 20, 73–84. https://doi.org/10.1007/978-3-319-14148-0_7
