Geometric Transformer for End-to-End Molecule Properties Prediction


Abstract

Transformers have become methods of choice in many applications thanks to their ability to represent complex interactions between elements. However, extending the Transformer architecture to non-sequential data such as molecules, and training it on small datasets, remains a challenge. In this work, we introduce a Transformer-based architecture for molecule property prediction that captures the geometry of the molecule. We replace the classical positional encoding with an initial encoding of the molecule's geometry and add a learned gated self-attention mechanism. We further propose an augmentation scheme for molecular data that avoids the overfitting induced by the overparameterized architecture. The proposed framework outperforms state-of-the-art methods while relying purely on machine learning: it incorporates no domain knowledge from quantum chemistry and uses no geometric inputs beyond the pairwise atomic distances.
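To make the abstract's idea concrete, the sketch below shows one possible reading of a distance-gated self-attention head: the usual content-based attention scores are modulated by a learned sigmoid gate applied to the pairwise atomic distances. This is a minimal, illustrative NumPy sketch under assumed shapes and scalar gate parameters, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gated_geometric_attention(h, dist, Wq, Wk, Wv, w_gate, b_gate):
    """One self-attention head whose scores are gated by pairwise
    atomic distances (hypothetical sketch, not the paper's exact layer).

    h    : (n_atoms, d)       atom embeddings
    dist : (n_atoms, n_atoms) pairwise atomic distances
    Wq, Wk, Wv : (d, d)       projection matrices
    w_gate, b_gate : scalars  learned gate parameters (assumed scalar here)
    """
    q, k, v = h @ Wq, h @ Wk, h @ Wv
    scores = q @ k.T / np.sqrt(k.shape[-1])            # content term
    gate = 1.0 / (1.0 + np.exp(-(w_gate * dist + b_gate)))  # sigmoid gate on distances
    # Adding log(gate) multiplies the attention weights by the gate
    # before renormalization.
    return softmax(scores + np.log(gate + 1e-9)) @ v
```

With a negative `w_gate`, the gate decays with distance, so attention concentrates on nearby atoms; the geometry thus enters the model only through the distance matrix, consistent with the abstract's claim of using no extended geometric inputs.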

Citation (APA)

Choukroun, Y., & Wolf, L. (2022). Geometric Transformer for End-to-End Molecule Properties Prediction. In IJCAI International Joint Conference on Artificial Intelligence (pp. 2895–2901). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2022/401
