PREPRINT

Pixelated Reconstruction of Gravitational Lenses using Recurrent Inference Machines

Alexandre Adam, Laurence Perreault-Levasseur, Yashar Hezaveh
arXiv:2207.01073

Submitted on 3 July 2022

Abstract

Modeling strong gravitational lenses to quantify the distortions in the images of background sources and to reconstruct the mass density in the foreground lenses has traditionally been a difficult computational challenge. As the quality of gravitational lens images increases, fully exploiting the information they contain becomes computationally and algorithmically more demanding. In this work, we use a neural network based on the Recurrent Inference Machine (RIM) to simultaneously reconstruct an undistorted image of the background source and the lens mass density distribution as pixelated maps. The method iteratively reconstructs the model parameters (the source and density map pixels) by learning to optimize their likelihood given the data, using a physical model (a ray-tracing simulation) regularized by a prior implicitly learned by the neural network from its training data. Compared to more traditional parametric models, the proposed method is significantly more expressive and can reconstruct complex mass distributions, which we demonstrate using realistic lensing galaxies taken from the cosmological hydrodynamic simulation IllustrisTNG.
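The iterative scheme described above can be sketched in miniature. The snippet below is an illustrative toy, not the paper's implementation: a fixed linear operator stands in for the differentiable ray-tracing simulation, and a momentum-style gradient step stands in for the learned recurrent update that a trained RIM would apply to the likelihood gradient and its hidden state.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the forward model: a fixed linear operator.
# In the paper this would be a differentiable lensing (ray-tracing) simulation.
A = rng.normal(size=(16, 8))
x_true = rng.normal(size=8)   # "model parameters" (source / density pixels)
y = A @ x_true                # noiseless mock observation

def likelihood_gradient(x, y, A, sigma=1.0):
    """Gradient of the Gaussian log-likelihood with respect to x."""
    residual = y - A @ x
    return A.T @ residual / sigma**2

def rim_like_update(x, grad, h, step=0.05, momentum=0.9):
    """Placeholder for the learned recurrent update g_theta.

    A trained RIM maps (current estimate, likelihood gradient, hidden
    state) to an update and a new hidden state; here we mimic that
    with a simple momentum accumulator."""
    h = momentum * h + grad
    return x + step * h, h

x = np.zeros(8)   # initial parameter estimate
h = np.zeros(8)   # recurrent "hidden state"
for _ in range(300):
    g = likelihood_gradient(x, y, A)
    x, h = rim_like_update(x, g, h)
```

After a few hundred iterations the estimate converges to the true parameters on this toy problem; in the actual method the hand-written update rule is replaced by a neural network trained end-to-end, which is what lets the prior be learned implicitly from the training data.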


Comment: 4+10 pages, 4+5 figures, accepted at the ICML 2022 Workshop on Machine Learning for Astrophysics

Subjects: Astrophysics - Instrumentation and Methods for Astrophysics; Astrophysics - Cosmology and Nongalactic Astrophysics

URL: https://arxiv.org/abs/2207.01073