KPGT: Knowledge-Guided Pre-training of Graph Transformer for Molecular Property Prediction


Abstract

Designing accurate deep learning models for molecular property prediction plays an increasingly essential role in drug and material discovery. Recently, due to the scarcity of labeled molecules, self-supervised learning methods for learning generalizable and transferable representations of molecular graphs have attracted considerable attention. In this paper, we argue that two major issues hinder current self-supervised learning methods from achieving the desired performance on molecular property prediction: ill-defined pre-training tasks and limited model capacity. To address these issues, we introduce Knowledge-guided Pre-training of Graph Transformer (KPGT), a novel self-supervised learning framework for molecular graph representation learning that improves performance on downstream molecular property prediction tasks. More specifically, we first introduce a high-capacity model, named Line Graph Transformer (LiGhT), which emphasizes the importance of chemical bonds and is mainly designed to model the structural information of molecular graphs. Then, a knowledge-guided pre-training strategy is proposed to exploit additional knowledge of molecules to guide the model to capture abundant structural and semantic information from large-scale unlabeled molecular graphs. Extensive computational tests demonstrate that KPGT offers superior performance over current state-of-the-art methods on several molecular property prediction tasks.
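As context for the abstract's mention of the Line Graph Transformer (LiGhT): a line graph is the standard graph-theoretic construction in which each edge of the original graph (here, each chemical bond) becomes a node, and two such nodes are connected whenever the original bonds share an atom. The sketch below illustrates this construction on a toy molecule; the function name and the example are illustrative and not taken from the paper.

```python
def line_graph(edges):
    """Return the edge list of the line graph of an undirected graph.

    Nodes of the line graph are the original edges; two edge-nodes are
    adjacent if the corresponding edges share an endpoint.
    """
    lg_edges = []
    for i in range(len(edges)):
        for j in range(i + 1, len(edges)):
            # Two bonds are adjacent in the line graph iff they share an atom.
            if set(edges[i]) & set(edges[j]):
                lg_edges.append((edges[i], edges[j]))
    return lg_edges

# Toy example: a 3-atom chain (atoms 0-1-2, e.g. a C-O-C skeleton).
bonds = [(0, 1), (1, 2)]
print(line_graph(bonds))  # [((0, 1), (1, 2))] -- the two bonds share atom 1
```

Operating on the line graph places bonds, rather than atoms, at the center of the representation, which is consistent with the abstract's statement that LiGhT "emphasizes the importance of chemical bonds."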



Citation (APA)

Li, H., Zhao, D., & Zeng, J. (2022). KPGT: Knowledge-Guided Pre-training of Graph Transformer for Molecular Property Prediction. In Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (pp. 857–867). Association for Computing Machinery. https://doi.org/10.1145/3534678.3539426

Readers' Seniority

- PhD / Post grad / Masters / Doc: 13 (68%)
- Researcher: 4 (21%)
- Professor / Associate Prof.: 1 (5%)
- Lecturer / Post doc: 1 (5%)

Readers' Discipline

- Computer Science: 10 (59%)
- Chemistry: 3 (18%)
- Pharmacology, Toxicology and Pharmaceut...: 2 (12%)
- Engineering: 2 (12%)
