Microbial interactions from a new perspective: reinforcement learning reveals new insights into microbiome evolution

Abstract

Motivation: Microbes are an essential part of all ecosystems, influencing material flow and shaping their surroundings. Metabolic modeling has been a useful tool and has provided tremendous insight into microbial community metabolism. However, current methods based on flux balance analysis (FBA) usually fail to predict the metabolic and regulatory strategies that lead to long-term survival and stability, especially in heterogeneous communities.

Results: Here, we introduce a novel reinforcement learning algorithm, Self-Playing Microbes in Dynamic FBA, which treats microbial metabolism as a decision-making process, allowing individual microbial agents to evolve by learning and adapting metabolic strategies for enhanced long-term fitness. The algorithm predicts which microbial flux regulation policies will stabilize in the dynamic ecosystem of interest in the presence of other microbes, with minimal reliance on predefined strategies. Throughout this article, we present several scenarios wherein our algorithm outperforms existing methods in reproducing outcomes, and we explore the biological significance of these predictions.
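The abstract describes the approach only at a high level: microbial metabolism is framed as a decision-making process in which agents learn flux-regulation policies within a dynamic FBA simulation. The sketch below is a minimal, hypothetical illustration of that framing for a single agent, not the authors' implementation: the toy two-reaction "metabolic model", the Michaelis-Menten uptake cap, the enzyme-cost term, the REINFORCE update, and every parameter value are assumptions made purely for illustration.

```python
# Minimal sketch (assumed, illustrative only): one microbial agent learns an
# uptake-bound policy inside a dynamic-FBA-like loop via REINFORCE.
# The "metabolic model" is a toy 2-reaction LP solved with scipy.
import numpy as np
from scipy.optimize import linprog

YIELD = 0.5            # biomass produced per unit substrate taken up (assumed)
KM, VMAX = 0.5, 10.0   # assumed Michaelis-Menten cap on uptake
COST = 0.02            # assumed regulatory/enzyme cost per unit uptake capacity
DT, STEPS = 0.1, 40    # dynamic-FBA time step and episode length (assumed)

def fba_step(uptake_bound, substrate):
    """Toy FBA: maximize growth subject to an agent-chosen uptake bound."""
    # Variables: [v_uptake, v_growth]; objective: maximize v_growth.
    kinetic_cap = VMAX * substrate / (KM + substrate)
    res = linprog(
        c=[0.0, -1.0],                 # minimize -v_growth
        A_ub=[[-YIELD, 1.0]],          # v_growth <= YIELD * v_uptake
        b_ub=[0.0],
        bounds=[(0.0, min(uptake_bound, kinetic_cap)), (0.0, None)],
        method="highs",
    )
    v_up, v_gr = res.x
    return v_up, v_gr

def run_episode(mean, log_std, rng):
    """Roll out one dynamic-FBA episode with a Gaussian policy over uptake bounds."""
    biomass, substrate = 0.01, 10.0
    actions, rewards = [], []
    for _ in range(STEPS):
        a = rng.normal(mean, np.exp(log_std))      # sampled uptake bound (action)
        v_up, v_gr = fba_step(max(a, 0.0), substrate)
        growth = v_gr * biomass * DT
        substrate = max(substrate - v_up * biomass * DT, 0.0)
        biomass += growth
        actions.append(a)
        # Reward: biomass gained minus an assumed cost of maintaining uptake capacity.
        rewards.append(growth - COST * max(a, 0.0) * DT)
    return np.array(actions), np.array(rewards)

def train(episodes=200, lr=0.05, seed=0):
    """REINFORCE update on the mean of the Gaussian uptake-bound policy."""
    rng = np.random.default_rng(seed)
    mean, log_std = 1.0, np.log(0.5)
    for _ in range(episodes):
        actions, rewards = run_episode(mean, log_std, rng)
        returns = np.cumsum(rewards[::-1])[::-1]   # reward-to-go
        baseline = returns.mean()
        # grad of log N(a | mean, std^2) w.r.t. mean is (a - mean) / std^2
        grad = np.mean((actions - mean) / np.exp(2 * log_std) * (returns - baseline))
        mean += lr * grad                          # gradient ascent on expected return
    return mean

if __name__ == "__main__":
    print("learned uptake-bound policy mean:", round(train(), 3))
```

In the setting the abstract describes, many such agents with genome-scale models would act in a shared environment, and the policies of interest are those that remain stable in the presence of the other microbes' strategies; the single-agent loop above only illustrates the decision-making framing.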

Cite (APA)

Ghadermazi, P., & Chan, S. H. J. (2024). Microbial interactions from a new perspective: reinforcement learning reveals new insights into microbiome evolution. Bioinformatics, 40(1). https://doi.org/10.1093/bioinformatics/btae003
