In contrast to conventional analog screen-film mammography, new flat-panel detectors have a high dynamic range and a linear characteristic curve. Hence, the radiographic technique can be optimized independently of the receptor exposure and focused exclusively on improving image quality and reducing patient dose. In this paper we quantify image quality by a physical quantity, the signal difference-to-noise ratio (SDNR), and patient risk by the average glandular dose (AGD). Using these quantities, we compare the following setups through simulations and phantom studies with respect to the detection of microcalcifications and tumors for different breast thicknesses and breast compositions: monochromatic radiation; three anode/filter combinations, molybdenum/molybdenum (MoMo), molybdenum/rhodium (MoRh), and tungsten/rhodium (WRh); different filter thicknesses; use of anti-scatter grids; and different tube voltages. For a digital mammography system based on an amorphous selenium detector we found that, first, the WRh combination is the best choice for all detection tasks studied; second, monochromatic radiation can further reduce the AGD by a factor of up to 2.3 while maintaining image quality in comparison with a real polychromatic x-ray tube spectrum; and, third, an anti-scatter grid is advantageous only for breast thicknesses larger than approximately 5 cm. © 2006 American Association of Physicists in Medicine.
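As a point of reference for the image-quality metric named above, the SDNR is commonly defined as the absolute difference between the mean pixel values of a signal region (e.g., a simulated microcalcification or tumor) and a background region, divided by the standard deviation of the background. The sketch below illustrates this standard definition only; the region-of-interest values are hypothetical and the paper's exact measurement procedure may differ.

```python
from statistics import mean, pstdev

def sdnr(signal_roi, background_roi):
    """SDNR = |mean(signal) - mean(background)| / std(background).

    signal_roi, background_roi: iterables of pixel values from two
    regions of interest (ROIs) in the detector image.
    """
    return abs(mean(signal_roi) - mean(background_roi)) / pstdev(background_roi)

# Hypothetical pixel values for illustration only.
signal = [110, 112, 108, 111, 109]       # ROI covering the detail
background = [100, 102, 98, 101, 99]     # ROI in homogeneous background

print(round(sdnr(signal, background), 2))  # → 7.07
```

A higher SDNR indicates that the detail stands out more clearly against image noise, which is why the optimization in the paper seeks spectra that maximize SDNR at a given AGD (or minimize AGD at a given SDNR).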
Bernhardt, P., Mertelmeier, T., & Hoheisel, M. (2006). X-ray spectrum optimization of full-field digital mammography: Simulation and phantom study. Medical Physics, 33(11), 4337–4349. https://doi.org/10.1118/1.2351951