Abstract
Magnetic resonance imaging (MRI) is a leading imaging modality for many clinical applications; however, its lengthy data acquisition is a significant drawback. This motivates the development of methods for reconstructing images from sparsely sampled data. One such technique is the Variational Network (VN), a machine learning method that generalizes traditional iterative reconstruction by learning the regularization term from large amounts of image data. The VN was previously shown to reconstruct 4-fold accelerated knee images with high fidelity. In this work we extend the VN approach to applications beyond knee imaging and evaluate the classic VN and a newly developed Unet-VN in five anatomical regions. We evaluate networks trained individually for each anatomical area as well as networks trained jointly on data from all areas. The VN and Unet-VN were trained to reconstruct 4-fold accelerated images of knees, brains, hips, ankles, and shoulders, and the structural similarity index (SSIM) was calculated to quantitatively evaluate the reconstructed images. Results show that the Unet-VN outperforms the classic VN both quantitatively, in terms of SSIM, and qualitatively. The jointly trained multi-anatomy networks approach the performance of the individually trained networks while offering the simplicity of a single network for a range of clinical applications, a substantial benefit for clinical translation.
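The abstract describes the VN as a learned generalization of iterative reconstruction: each iteration combines a data-consistency gradient (from the undersampled k-space measurements) with the gradient of a learned regularizer. A minimal sketch of one such step, assuming a masked 2D FFT as the forward operator and using toy filters with a tanh activation; the function name, filter shapes, and activation are illustrative assumptions, not the paper's learned parameters:

```python
import numpy as np
from scipy.signal import convolve2d

def vn_step(u, f, mask, filters, act_weights, lam):
    """One (toy) variational-network gradient step.

    u           : current image estimate (2D complex array)
    f           : undersampled k-space measurements
    mask        : binary k-space sampling mask
    filters     : list of small convolution kernels (learned regularizer)
    act_weights : per-filter activation scales (stand-in for learned activations)
    lam         : data-consistency weight
    """
    # Data-consistency gradient: A^H (A u - f), with A = mask * FFT.
    Au = mask * np.fft.fft2(u)
    dc_grad = np.fft.ifft2(mask * (Au - f))

    # Learned regularizer gradient: sum_i K_i^T phi'_i(K_i u).
    reg_grad = np.zeros_like(u)
    for k, w in zip(filters, act_weights):
        resp = convolve2d(u.real, k, mode="same")
        phi_prime = w * np.tanh(resp)          # toy activation derivative
        # Transposed convolution = convolution with the flipped kernel.
        reg_grad += convolve2d(phi_prime, k[::-1, ::-1], mode="same")

    return u - reg_grad - lam * dc_grad
```

In the actual VN, the filters and activation functions are trained end-to-end over a fixed number of such unrolled iterations, rather than hand-chosen as here.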
Citation
Johnson, P. M., Muckley, M. J., Bruno, M., Kobler, E., Hammernik, K., Pock, T., & Knoll, F. (2019). Joint Multi-anatomy Training of a Variational Network for Reconstruction of Accelerated Magnetic Resonance Image Acquisitions. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11905 LNCS, pp. 71–79). Springer. https://doi.org/10.1007/978-3-030-33843-5_7