Exploring the GDB-13 chemical space using deep generative models

Citations: 111 · Mendeley readers: 176

This article is free to access.

Abstract

Recent applications of recurrent neural networks (RNNs) enable training models that sample the chemical space. In this study we train an RNN on SMILES string representations of molecules from a subset of the enumerated database GDB-13 (975 million molecules). We show that a model trained on 1 million structures (0.1% of the database) reproduces 68.9% of the entire database when sampling 2 billion molecules. We also develop a method to assess the quality of the training process using negative log-likelihood plots. Furthermore, we use a mathematical model based on the "coupon collector problem" to compare the trained model against a theoretical upper bound, which lets us quantify how much it has learned. We suggest that this method can serve as a benchmark for the learning capability of any molecular generative model architecture. Finally, an analysis of the generated chemical space shows that, largely due to the syntax of SMILES, complex molecules with many rings and heteroatoms are harder to sample.
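The upper bound mentioned in the abstract can be derived from the classic coupon-collector (occupancy) model: an ideal generator that samples GDB-13 uniformly at random with replacement covers each molecule with probability 1 − (1 − 1/N)^k after k draws. A minimal sketch of that bound, assuming uniform sampling (the function name and numbers plugged in are illustrative, not the paper's code):

```python
import math

def expected_coverage(db_size: int, n_samples: int) -> float:
    """Expected fraction of a database of db_size distinct molecules
    recovered after n_samples uniform draws with replacement
    (occupancy / coupon-collector model)."""
    # A given molecule is missed by one draw with probability
    # (1 - 1/db_size), so after n_samples independent draws it is
    # seen with probability 1 - (1 - 1/db_size)**n_samples.
    # For large db_size this is ~ 1 - exp(-n_samples / db_size).
    return 1.0 - math.exp(n_samples * math.log1p(-1.0 / db_size))

# Upper bound for an ideal model sampling GDB-13 (975 M molecules)
# uniformly, drawing 2 billion samples:
ideal = expected_coverage(975_000_000, 2_000_000_000)
print(f"{ideal:.3f}")  # ~0.871
```

Comparing this ideal coverage (~87%) to the 68.9% the trained RNN actually reproduces is what allows the authors to quantify how much of the database distribution the model has learned.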

Citation (APA)

Arús-Pous, J., Blaschke, T., Ulander, S., Reymond, J. L., Chen, H., & Engkvist, O. (2019). Exploring the GDB-13 chemical space using deep generative models. Journal of Cheminformatics, 11(1). https://doi.org/10.1186/s13321-019-0341-z
