Detection of gravitational-wave signals from binary neutron star mergers using machine learning


Abstract

As two neutron stars merge, they emit gravitational waves that can potentially be detected by Earth-bound detectors. Matched-filtering-based algorithms have traditionally been used to extract quiet signals embedded in noise. We introduce a novel neural-network-based machine learning algorithm that uses time series strain data from gravitational-wave detectors to detect signals from nonspinning binary neutron star mergers. For the Advanced LIGO design sensitivity, our network has an average sensitive distance of 130 Mpc at a false-alarm rate of ten per month. Compared to other state-of-the-art machine learning algorithms, we find an improvement by a factor of 4 in sensitivity to signals with a signal-to-noise ratio between 8 and 15. However, this approach is not yet competitive with traditional matched-filtering-based methods. A conservative estimate indicates that our algorithm introduces on average 10.2 s of latency between signal arrival and generating an alert. We give an exact description of our testing procedure, which can be applied not only to machine-learning-based algorithms but also to all other search algorithms. We thereby improve the ability to compare machine learning and classical searches.
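The abstract does not specify the network architecture, sample rate, or segment length, so the following is only a minimal illustrative sketch of the general approach it describes: a 1D convolutional classifier that maps a segment of whitened detector strain to a "signal present" score. The class name StrainClassifier, the 2048-sample input length, and all layer sizes are assumptions for illustration, not the authors' actual model.

```python
# Illustrative sketch only: a generic 1D CNN binary classifier for strain
# time series. Architecture and hyperparameters are placeholder assumptions.
import torch
import torch.nn as nn


class StrainClassifier(nn.Module):
    """Toy 1D CNN mapping a whitened strain segment to P(signal present)."""

    def __init__(self, n_samples: int = 2048):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=16), nn.ReLU(), nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=8), nn.ReLU(), nn.MaxPool1d(4),
            nn.Conv1d(32, 64, kernel_size=8), nn.ReLU(), nn.MaxPool1d(4),
        )
        # Infer the flattened feature size for the chosen input length.
        with torch.no_grad():
            n_feat = self.features(torch.zeros(1, 1, n_samples)).numel()
        self.head = nn.Sequential(
            nn.Flatten(), nn.Linear(n_feat, 64), nn.ReLU(),
            nn.Linear(64, 1), nn.Sigmoid(),
        )

    def forward(self, strain: torch.Tensor) -> torch.Tensor:
        # strain: (batch, 1, n_samples) whitened detector strain.
        return self.head(self.features(strain))


if __name__ == "__main__":
    model = StrainClassifier(n_samples=2048)
    fake_segment = torch.randn(4, 1, 2048)   # stand-in for whitened strain data
    print(model(fake_segment).squeeze(-1))   # per-segment detection scores
```

In a real search such as the one the paper evaluates, the per-segment scores would be turned into candidate events by thresholding a ranking statistic, with the threshold chosen to achieve a target false-alarm rate (e.g., the ten per month quoted in the abstract).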

Citation (APA)

Schäfer, M. B., Ohme, F., & Nitz, A. H. (2020). Detection of gravitational-wave signals from binary neutron star mergers using machine learning. Physical Review D, 102(6). https://doi.org/10.1103/PhysRevD.102.063015
