Bedrock radioactivity influences the rate and spectrum of mutation


Abstract

All organisms on Earth are exposed to low doses of natural radioactivity, but some habitats are more radioactive than others. Yet documenting the influence of natural radioactivity on the evolution of biodiversity is challenging. Here, we addressed whether organisms living in naturally more radioactive habitats accumulate more mutations across generations, using 14 species of waterlice living in subterranean habitats with contrasting levels of radioactivity. We found that the mitochondrial and nuclear mutation rates across a waterlouse species’ genome increased on average by 60% and 30%, respectively, when radioactivity increased by a factor of three. We also found a positive correlation between the level of radioactivity and the probability of G to T (and complementary C to A) mutations, a hallmark of oxidative stress. We conclude that even low doses of natural bedrock radioactivity influence the mutation rate, possibly through the accumulation of oxidative damage, particularly in the mitochondrial genome.

Citation (APA)

Saclier, N., Chardon, P., Malard, F., Konecny-Dupré, L., Eme, D., Bellec, A., … Douady, C. J. (2020). Bedrock radioactivity influences the rate and spectrum of mutation. eLife, 9, 1–20. https://doi.org/10.7554/eLife.56830
