Dedicated CT imaging of the breast holds great promise for improving the detection and diagnosis of early-stage breast cancer. Before CT breast imaging (CTBI) can become a clinical reality, however, the image degradation caused by its rather large scattered radiation component must be addressed. One approach to reducing scatter in CTBI is the use of anti-scatter grids. This paper describes a theoretical study analyzing the ideal linear observer signal-to-noise ratio (SNR) achieved with various hypothetical grids. Results suggest that anti-scatter grids can improve performance provided the grid's primary x-ray transmission is high enough and its scatter transmission is low enough. It is shown that for detection of a 4 mm lesion in a dense breast (95% fibroglandular tissue), the AUC (area under the ROC curve) for a grid with 80% primary transmission and a 5% scatter-to-primary ratio would be 99%, whereas the AUC for the no-grid case would be 88%. Thus, for this specific CTBI task, the ideal linear observer with a high-performance grid would substantially outperform that with no grid. Similar improvement in performance with grids was observed for the detection of microcalcifications. © 2009 Springer-Verlag.
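The abstract reports performance both as an ideal-observer SNR and as an AUC. These two figures of merit are linked by a standard relation from signal detection theory (not specific to this paper): for a Gaussian decision variable, AUC = Φ(SNR/√2), where Φ is the standard normal CDF. A minimal sketch of that conversion, with illustrative (not source-reported) SNR values:

```python
import math

def auc_from_snr(snr: float) -> float:
    """Convert an ideal-observer detectability SNR (d') to AUC.

    Uses the standard signal-detection relation AUC = Phi(SNR / sqrt(2)),
    where Phi is the standard normal CDF, written here via erf:
    Phi(x) = 0.5 * (1 + erf(x / sqrt(2))), so AUC = 0.5 * (1 + erf(snr / 2)).
    """
    return 0.5 * (1.0 + math.erf(snr / 2.0))

# Illustrative values (assumed, not taken from the paper):
# an SNR of ~3.29 corresponds to AUC ~0.99, roughly the grid case cited,
# while an SNR of ~1.66 corresponds to AUC ~0.88, roughly the no-grid case.
print(auc_from_snr(3.29))  # close to 0.99
print(auc_from_snr(1.66))  # close to 0.88
```

This mapping is monotonic, so any increase in ideal-observer SNR from a grid translates directly into a higher AUC for the detection task.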
CITATION STYLE
Glick, S. J., & Didier, C. S. (2009). Can anti-scatter grids improve image quality in breast CT. In IFMBE Proceedings (Vol. 25, pp. 868–870). Springer Verlag. https://doi.org/10.1007/978-3-642-03879-2_243