Estimates made in the 1970s indicated that a supernova occurring within tens of parsecs of Earth could have significant effects on the ozone layer. Since that time, improved tools for detailed modeling of atmospheric chemistry have been developed to calculate ozone depletion, and advances have been made in theoretical modeling of supernovae and of the resultant gamma-ray spectra. In addition, we now have better knowledge of the occurrence rate of supernovae in the Galaxy and of the spatial distribution of progenitors to core-collapse supernovae. We report here the results of two-dimensional atmospheric model calculations that take as input the spectral energy distribution of a supernova, adopting various distances from Earth and various latitude impact angles. In separate simulations we calculate the ozone depletion due to both gamma rays and cosmic rays. We find that for the combined ozone depletion to roughly double the "biologically active" UV flux received at the surface of the Earth, the supernova must occur at a distance of <8 pc. Based on the latest data, the time-averaged Galactic rate of core-collapse supernovae occurring within 8 pc is ~1.5/Gyr. Comparing our calculated ozone depletions with those of previous studies, we find them to be significantly less severe than those of Ruderman (1974) and consistent with those of Whitten et al. (1976). In summary, given the amplitude of the effect, the rate of nearby supernovae, and the ~Gyr timescale over which multicellular organisms have existed on Earth, this particular pathway for mass extinctions may be less important than previously thought.
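To illustrate the scaling behind the quoted nearby-supernova rate, the sketch below treats core-collapse progenitors as uniformly distributed in the solar neighborhood, so that the rate within a distance d grows with the enclosed volume. The local rate density is an illustrative value back-solved from the quoted ~1.5/Gyr within 8 pc; it is not the paper's progenitor model, which uses the observed spatial distribution of progenitors. The link from ozone loss to UV increase uses the standard radiation amplification factor (RAF) power-law approximation, again an assumption rather than a result from the paper.

```python
import math

# Assumed local core-collapse SN rate density [SNe / Gyr / pc^3].
# Back-solved so that the 8 pc result reproduces the quoted ~1.5/Gyr;
# the paper itself models the actual distribution of progenitors.
LOCAL_RATE_DENSITY = 1.5 / ((4.0 / 3.0) * math.pi * 8.0**3)

def nearby_sn_rate(distance_pc: float) -> float:
    """Time-averaged rate of core-collapse SNe within distance_pc [per Gyr],
    assuming a uniform local rate density. This volume scaling is only
    plausible for distances well below the disk scale height (~100 pc)."""
    volume_pc3 = (4.0 / 3.0) * math.pi * distance_pc**3
    return LOCAL_RATE_DENSITY * volume_pc3

def ozone_loss_to_double_uv(raf: float = 2.0) -> float:
    """Fractional ozone column loss that doubles surface UV, using the
    power-law RAF approximation UV ∝ (O3 column)^(-RAF). RAF ~ 2 is a
    typical DNA-weighted value (an assumption, not taken from the paper)."""
    return 1.0 - 2.0 ** (-1.0 / raf)

if __name__ == "__main__":
    print(f"SN rate within 8 pc : {nearby_sn_rate(8.0):.2f} per Gyr")
    print(f"SN rate within 30 pc: {nearby_sn_rate(30.0):.1f} per Gyr")
    print(f"Ozone loss to double UV (RAF=2): {ozone_loss_to_double_uv():.0%}")
```

Because the rate scales as the cube of distance under this assumption, the conclusion is sensitive to the ~8 pc threshold: moving the threshold outward by even a factor of two would raise the implied event rate by nearly an order of magnitude.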
Citation:
Gehrels, N., Laird, C. M., Jackman, C. H., Cannizzo, J. K., Mattson, B. J., & Chen, W. (2003). Ozone Depletion from Nearby Supernovae. The Astrophysical Journal, 585(2), 1169–1176. https://doi.org/10.1086/346127