Development of convolutional neural networks for an electron-tracking Compton camera

Abstract

The Electron-Tracking Compton Camera (ETCC), a complete Compton camera that tracks Compton-scattering electrons with a gaseous micro time projection chamber, is expected to open up MeV gamma-ray astronomy. The technical challenge in achieving a point-spread function of a few degrees is the precise determination of the electron recoil direction and the scattering position from track images. We attempted to reconstruct these parameters using convolutional neural networks. Two network models were designed, one to predict the recoil direction and one the scattering position. For simulated 75 keV electrons in argon-based gas at 2 atm, these models achieved an angular resolution of 41° and a position resolution of 2.1 mm. In addition, the point-spread function of the ETCC was improved from 22° to 15° for experimental data from a 662 keV gamma-ray source. This performance greatly surpasses that of the traditional analysis.
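
The abstract describes two regression networks operating on electron-track images. The sketch below is a minimal illustration of that setup, not the authors' actual architecture: a small convolutional backbone in PyTorch instantiated twice, once to regress the recoil direction as a unit vector and once to regress the scattering position; the layer sizes, image size, and output parameterization are all assumptions.

```python
# Hypothetical sketch (not the paper's networks): two small CNNs that take a
# 2-D electron-track image and regress (a) the recoil direction as a unit
# vector and (b) the scattering position in the readout plane.
import torch
import torch.nn as nn

class TrackCNN(nn.Module):
    """Convolutional feature extractor followed by a task-specific head."""
    def __init__(self, out_dim: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, out_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x))

# Model 1: recoil direction, output normalised to a 3-D unit vector.
direction_net = TrackCNN(out_dim=3)
# Model 2: scattering position (x, y) in the detector plane.
position_net = TrackCNN(out_dim=2)

# Forward pass on a batch of dummy 64x64 single-channel track images.
tracks = torch.randn(8, 1, 64, 64)
direction = nn.functional.normalize(direction_net(tracks), dim=1)
position = position_net(tracks)
print(direction.shape, position.shape)  # torch.Size([8, 3]) torch.Size([8, 2])
```

In practice such models would be trained on simulated tracks with known recoil directions and scattering positions, for example with a cosine-distance loss for the direction and a mean-squared-error loss for the position; the paper's actual training procedure is described in the full text.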

Citation (APA)

Ikeda, T., Takada, A., Abe, M., Yoshikawa, K., Tsuda, M., Ogio, S., … Tanimori, T. (2021). Development of convolutional neural networks for an electron-tracking Compton camera. Progress of Theoretical and Experimental Physics, 2021(8). https://doi.org/10.1093/ptep/ptab091
