Conditional Generative Adversarial Networks Aided Motion Correction of Dynamic 18F-FDG PET Brain Studies


Abstract

This work set out to develop a motion-correction approach aided by conditional generative adversarial network (cGAN) methodology that allows reliable, data-driven determination of involuntary subject motion during dynamic 18F-FDG brain studies.

Methods: Ten healthy volunteers (5 men/5 women; mean age ± SD, 27 ± 7 y; weight, 70 ± 10 kg) underwent a test–retest 18F-FDG PET/MRI examination of the brain (n = 20). The imaging protocol consisted of a 60-min PET listmode acquisition contemporaneously acquired with MRI, including MR navigators and a 3-dimensional time-of-flight MR angiography sequence. Arterial blood samples were collected as a reference standard representing the arterial input function (AIF). Training of the cGAN was performed using 70% of the total datasets (n = 16, randomly chosen), which were corrected for motion using the MR navigators. The resulting cGAN mappings (between individual frames and the reference frame [55–60 min after injection]) were then applied to the test dataset (remaining 30%, n = 6), producing artificially generated low-noise images from early high-noise PET frames. These low-noise images were then coregistered to the reference frame, yielding 3-dimensional motion vectors. Performance of cGAN-aided motion correction was assessed by comparing the image-derived input function (IDIF) extracted from a cGAN-aided motion-corrected dynamic sequence with the AIF based on the areas under the curves (AUCs). Moreover, clinical relevance was assessed through direct comparison of the average cerebral metabolic rates of glucose (CMRGlc) values in gray matter calculated using the AIF and the IDIF.

Results: The absolute percentage difference between AUCs derived using the motion-corrected IDIF and the AIF was 1.2% ± 0.9%. The gray matter CMRGlc values determined using these 2 input functions differed by less than 5% (2.4% ± 1.7%).

Conclusion: A fully automated data-driven motion-compensation approach was established and tested for 18F-FDG PET brain imaging. cGAN-aided motion correction enables the translation of noninvasive clinical absolute quantification from PET/MR to PET/CT by allowing the accurate determination of motion vectors from the PET data itself.
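The sketch below illustrates the workflow described in the abstract: each early, high-noise PET frame is first passed through the trained cGAN to produce a low-noise, reference-like image, that image is rigidly coregistered to the late reference frame (55–60 min after injection) to obtain a 3-dimensional motion vector, and the resulting transform is applied to the original frame; a second helper computes the AUC-based percentage difference used to compare the IDIF against the AIF. This is a minimal, hypothetical sketch only: `generator` and `rigid_register` are placeholder callables standing in for the authors' trained cGAN and their registration tool, and are not part of the published method or code.

```python
# Minimal sketch of cGAN-aided motion correction (assumptions: `generator`
# and `rigid_register` are user-supplied placeholders, not the authors' code).
import numpy as np
from typing import Callable, List, Tuple

def motion_correct_frames(
    frames: List[np.ndarray],
    reference: np.ndarray,
    generator: Callable[[np.ndarray], np.ndarray],
    rigid_register: Callable[[np.ndarray, np.ndarray],
                             Tuple[np.ndarray, Callable[[np.ndarray], np.ndarray]]],
) -> Tuple[List[np.ndarray], List[np.ndarray]]:
    """Denoise each early frame with the cGAN, estimate rigid motion against
    the late reference frame, and apply the transform to the original frame."""
    corrected, motion_vectors = [], []
    for frame in frames:
        low_noise = generator(frame)                    # cGAN: high-noise -> low-noise image
        vector, resample = rigid_register(low_noise, reference)  # 3-D motion vector + resampler
        corrected.append(resample(frame))               # apply transform to the raw frame
        motion_vectors.append(vector)
    return corrected, motion_vectors

def auc_percent_difference(t: np.ndarray, idif: np.ndarray, aif: np.ndarray) -> float:
    """Absolute percentage difference between the AUCs of the IDIF and the AIF,
    the figure of merit quoted in the Results (1.2% +/- 0.9%)."""
    auc_idif = np.trapz(idif, t)
    auc_aif = np.trapz(aif, t)
    return 100.0 * abs(auc_idif - auc_aif) / auc_aif
```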

Citation (APA)

Sundar, L. K. S., Iommi, D., Muzik, O., Chalampalakis, Z., Klebermass, E. M., Hienert, M., … Beyer, T. (2021). Conditional Generative Adversarial Networks Aided Motion Correction of Dynamic 18F-FDG PET Brain Studies. Journal of Nuclear Medicine, 62(6), 871–879. https://doi.org/10.2967/jnumed.120.248856
