Effect of detector lag on CT noise power spectra


Abstract

Purpose: The authors examined the effect of detector lag on the noise power spectrum (NPS) of CT images reconstructed with filtered backprojection (FBP).

Methods: The authors derived an analytical expression for the NPS with detector lag and verified it using computer simulations with parallel-beam and fan-beam geometries. The dependence of the NPS on the amount of lag, the location within the scanned field of view (FOV), and the number of views used in the reconstruction (samples per rotation) was investigated using constant and view-dependent noise in the raw data.

Results: Detector lag introduces noise correlation in the azimuthal direction. The effect on the NPS is a frequency-dependent reduction in amplitude. In small regions of the image, the effect is confined primarily to the frequencies corresponding to the azimuthal direction. The noise blurring and NPS filtering increase with increasing radial distance, so regions at larger radial distances have lower noise power. For the same detector lag response function, the amount of noise correlation and NPS filtering decreases with an increasing number of views.

Conclusions: The shape of the NPS depends on the detector lag coefficients, the location of the region, and the number of views used in the reconstruction. In general, the noise correlation caused by detector lag decreased the amplitude of the NPS. © 2011 American Association of Physicists in Medicine.
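The mechanism described above can be illustrated with a small sketch. Here detector lag is modeled as a hypothetical first-order recursive (IIR) filter along the view axis, y[k] = (1 − a)·x[k] + a·y[k−1], where the coefficient `a` is an assumed value, not one taken from the paper. Filtering uncorrelated view noise this way correlates it between adjacent views and reduces noise power, most strongly at high view frequencies, which is the frequency-dependent NPS amplitude reduction the abstract reports.

```python
import numpy as np

def apply_lag(views, a=0.3):
    """Apply a first-order lag filter along the view (azimuthal) axis.

    Hypothetical single-coefficient lag model: each measured view is a
    recursive blend of the current true view and the previous output.
    """
    out = np.empty_like(views)
    out[0] = (1 - a) * views[0]
    for k in range(1, len(views)):
        out[k] = (1 - a) * views[k] + a * out[k - 1]
    return out

rng = np.random.default_rng(0)
n_views = 4096
white = rng.standard_normal(n_views)      # uncorrelated raw-data noise
lagged = apply_lag(white, a=0.3)          # lag-correlated noise

# Empirical noise power spectra along the view direction.
nps_white = np.abs(np.fft.rfft(white)) ** 2 / n_views
nps_lagged = np.abs(np.fft.rfft(lagged)) ** 2 / n_views

# Lag suppresses noise power, most strongly at high view frequencies.
half = len(nps_white) // 2
print(lagged.var() < white.var())
print(nps_lagged[half:].mean() < nps_white[half:].mean())
```

For this model the steady-state variance ratio is (1 − a)/(1 + a), so a larger lag coefficient gives stronger noise correlation and a lower NPS amplitude, consistent with the trend described in the Results.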


Citation (APA)

Baek, J., & Pelc, N. J. (2011). Effect of detector lag on CT noise power spectra. Medical Physics, 38(6), 2995–3005. https://doi.org/10.1118/1.3589135
