Pixelated Reconstruction of Foreground Density and Background Surface Brightness in Gravitational Lensing Systems Using Recurrent Inference Machines

  • Adam, A.
  • Perreault-Levasseur, L.
  • Hezaveh, Y.
  • Welling, M.

Abstract

Modeling strong gravitational lenses in order to quantify distortions in the images of background sources and to reconstruct the mass density in foreground lenses has been a difficult computational challenge. As the quality of gravitational lens images increases, the task of fully exploiting the information they contain becomes computationally and algorithmically more difficult. In this work, we use a neural network based on the recurrent inference machine to simultaneously reconstruct an undistorted image of the background source and the lens mass density distribution as pixelated maps. The method iteratively reconstructs the model parameters (the image of the source and a pixelated density map) by learning the process of optimizing the likelihood given the data using the physical model (a ray-tracing simulation), regularized by a prior implicitly learned by the neural network through its training data. When compared to more traditional parametric models, the proposed method is significantly more expressive and can reconstruct complex mass distributions, which we demonstrate by using realistic lensing galaxies taken from the IllustrisTNG cosmological hydrodynamic simulation.
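
To make the iterative scheme described in the abstract concrete, the sketch below shows an RIM-style update loop under strong simplifying assumptions: a linear matrix `A` stands in for the ray-tracing forward model, a Gaussian likelihood provides the data-fit gradient, and a hand-written momentum rule (`rim_step`) stands in for the trained recurrent network. All names (`likelihood_grad`, `rim_step`, `n_pix`) are illustrative, not the paper's implementation.

```python
# Minimal sketch of a recurrent-inference-machine-style reconstruction loop.
# Assumptions (not from the paper): linear forward operator A in place of the
# ray-tracing simulation, and an untrained momentum update in place of the
# learned recurrent network.
import numpy as np

rng = np.random.default_rng(0)

n_pix = 64                                              # flattened size of the pixelated maps
A = rng.normal(size=(n_pix, n_pix)) / np.sqrt(n_pix)    # stand-in forward (ray-tracing) operator
x_true = rng.normal(size=n_pix)                         # "true" parameters (source + density, flattened)
sigma = 0.1
y = A @ x_true + sigma * rng.normal(size=n_pix)         # simulated observation

def likelihood_grad(x):
    """Gradient of the Gaussian log-likelihood log p(y | x) with respect to x."""
    return A.T @ (y - A @ x) / sigma**2

def rim_step(x, grad, h, step=1e-3, decay=0.9):
    """Stand-in for the learned recurrent update g_phi(x, grad, h).
    Here a simple momentum-like rule; in the paper this role is played by a
    trained recurrent neural network that also encodes the learned prior."""
    h = decay * h + (1.0 - decay) * grad
    return x + step * h, h

x = np.zeros(n_pix)   # initial guess for the pixelated parameters
h = np.zeros(n_pix)   # recurrent hidden state
for _ in range(200):  # fixed number of RIM iterations
    x, h = rim_step(x, likelihood_grad(x), h)

print("data residual:", np.linalg.norm(y - A @ x))
```

In the method described by the paper, the update function is a trained recurrent network rather than a fixed rule, so repeated application of this loop drives the reconstruction toward both a high likelihood under the ray-tracing model and the prior implicitly learned from the training data.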

Cite (APA)

Adam, A., Perreault-Levasseur, L., Hezaveh, Y., & Welling, M. (2023). Pixelated Reconstruction of Foreground Density and Background Surface Brightness in Gravitational Lensing Systems Using Recurrent Inference Machines. The Astrophysical Journal, 951(1), 6. https://doi.org/10.3847/1538-4357/accf84
