Synthetic aperture radar tomography (TomoSAR) has been extensively employed for 3-D reconstruction of dense urban areas using high-resolution SAR acquisitions. Compressive sensing (CS)-based algorithms are generally considered the state of the art in super-resolving TomoSAR, in particular in the single-look case. This superior performance comes at the cost of an extra computational burden, because the sparse reconstruction cannot be solved analytically and requires computationally expensive iterative solvers. In this article, we propose a novel deep learning-based super-resolving TomoSAR inversion approach, γ-Net, to tackle this challenge. γ-Net adopts the advanced complex-valued learned iterative shrinkage thresholding algorithm (CV-LISTA) to mimic the iterative optimization step in sparse reconstruction. Simulations show that the height estimates from a well-trained γ-Net approach the Cramér-Rao lower bound (CRLB) while improving the computational efficiency by one to two orders of magnitude compared with first-order CS-based methods. It also shows no degradation in super-resolution power compared with state-of-the-art second-order TomoSAR solvers, which are much more computationally expensive than the first-order methods. Specifically, γ-Net reaches a detection rate of more than 90% in moderately super-resolving cases with 25 measurements at 6 dB SNR. Moreover, a simulation with limited baselines demonstrates that the proposed algorithm outperforms the second-order CS-based method by a fair margin. A test on real TanDEM-X data with just six interferograms also shows high-quality 3-D reconstruction with a high density of detected double scatterers.
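To make the inversion step concrete, below is a minimal sketch of the classical complex-valued iterative shrinkage thresholding (ISTA) solver that CS-based TomoSAR relies on and that CV-LISTA unrolls into a trainable network. The variable names (`A`, `g`, `lam`, `n_iter`) and the NumPy formulation are illustrative assumptions, not the authors' γ-Net implementation; in γ-Net the fixed matrices and threshold used here are replaced by learned, layer-wise parameters.

```python
# Illustrative sketch only: classical complex-valued ISTA for a single
# TomoSAR pixel, under assumed names and a plain NumPy formulation.
# gamma-Net (CV-LISTA) unrolls a fixed number of such iterations and
# learns the weight matrices and thresholds instead of fixing them.
import numpy as np

def complex_soft_threshold(x, tau):
    """Soft-threshold the magnitude of complex x while keeping its phase."""
    mag = np.abs(x)
    return np.where(mag > tau, (1.0 - tau / np.maximum(mag, 1e-12)) * x, 0.0)

def complex_ista(g, A, lam=0.1, n_iter=200):
    """Recover a sparse reflectivity profile gamma from measurements g ≈ A @ gamma.

    g : (N,) complex measurement vector (one pixel across N interferograms)
    A : (N, L) complex steering matrix over L discretized elevation bins
    """
    L = A.shape[1]
    gamma = np.zeros(L, dtype=complex)
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1 / Lipschitz constant of A^H A
    for _ in range(n_iter):
        residual = g - A @ gamma             # data-fidelity gradient step ...
        gamma = complex_soft_threshold(gamma + step * (A.conj().T @ residual),
                                       step * lam)   # ... followed by shrinkage
    return gamma
```

Unrolling a fixed, small number of such iterations into network layers and learning the matrices and thresholds per layer is what allows the learned solver to match the reconstruction quality of iterative CS solvers at a fraction of their runtime, consistent with the one-to-two-order-of-magnitude speed-up reported above.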
Qian, K., Wang, Y., Shi, Y., & Zhu, X. X. (2022). γ-Net: Superresolving SAR Tomographic Inversion via Deep Learning. IEEE Transactions on Geoscience and Remote Sensing, 60. https://doi.org/10.1109/TGRS.2022.3164193