Co-supervised Pre-training of Pocket and Ligand

Abstract

Can we inject pocket-ligand complementarity knowledge into a pre-trained model and jointly learn their chemical spaces? Pre-training on molecules and proteins has attracted considerable attention in recent years, yet most approaches focus on learning only one of the two chemical spaces and do not account for their complementarity. We propose a co-supervised pre-training (CoSP) framework that learns 3D pocket and ligand representations simultaneously. We use a gated geometric message passing layer to model 3D pockets and ligands, taking each node's chemical features, geometric position, and direction into account. To learn meaningful biological embeddings, we inject pocket-ligand complementarity into the pre-training model via a ChemInfoNCE loss, combined with a chemical-similarity-enhanced negative sampling strategy that improves representation learning. Extensive experiments show that CoSP achieves competitive results in pocket matching, molecular property prediction, and virtual screening.
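To make the contrastive objective concrete, the sketch below shows an InfoNCE-style pocket-ligand loss with similarity-based reweighting of negatives. It is only an illustration under stated assumptions: the exact form of ChemInfoNCE, the embedding shapes, and the construction of the `neg_weights` matrix are not specified in the abstract, so the weighting scheme here is hypothetical rather than the authors' implementation.

```python
import torch
import torch.nn.functional as F


def chem_infonce_loss(pocket_emb, ligand_emb, neg_weights=None, temperature=0.1):
    """InfoNCE-style contrastive loss over pocket-ligand pairs.

    pocket_emb, ligand_emb: (B, D) embeddings of B matched pocket-ligand
    pairs; row i of each tensor is assumed to form a positive pair.
    neg_weights: optional (B, B) matrix reweighting negative pairs, e.g.
    derived from ligand chemical similarity (hypothetical form; the paper's
    ChemInfoNCE may weight negatives differently).
    """
    pocket_emb = F.normalize(pocket_emb, dim=-1)
    ligand_emb = F.normalize(ligand_emb, dim=-1)

    # Pairwise cosine similarities scaled by temperature: (B, B)
    logits = pocket_emb @ ligand_emb.t() / temperature

    if neg_weights is not None:
        # Reweight off-diagonal (negative) terms; positives on the diagonal
        # keep weight 1. Adding log-weights to the logits multiplies each
        # negative's contribution to the softmax denominator by its weight,
        # mimicking similarity-aware negative sampling.
        eye = torch.eye(logits.size(0), device=logits.device)
        weights = neg_weights * (1.0 - eye) + eye
        logits = logits + torch.log(weights.clamp_min(1e-8))

    targets = torch.arange(logits.size(0), device=logits.device)
    # Symmetric loss: pocket-to-ligand and ligand-to-pocket retrieval.
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))
```

In practice, `neg_weights` could for instance down-weight ligands that are chemically similar to the positive (e.g. by fingerprint similarity), so that likely false negatives contribute less to the denominator; this is one plausible reading of the chemical-similarity-enhanced negative sampling described above, not a statement of the paper's exact strategy.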

Citation (APA)

Gao, Z., Tan, C., Xia, J., & Li, S. Z. (2023). Co-supervised Pre-training of Pocket and Ligand. In Lecture Notes in Computer Science (Vol. 14169 LNAI, pp. 405–421). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-43412-9_24
