Graph Neural Networks for Multimodal Single-Cell Data Integration

40 citations · 62 Mendeley readers

Abstract

Recent advances in multimodal single-cell technologies have enabled the simultaneous acquisition of multiple omics data from the same cell, providing deeper insights into cellular states and dynamics. However, it is challenging to learn joint representations from multimodal data, to model the relationships between modalities, and, more importantly, to incorporate the vast amount of single-modality datasets into downstream analyses. To address these challenges and facilitate multimodal single-cell data analysis, three key tasks have been introduced: Modality prediction, Modality matching, and Joint embedding. In this work, we present scMoGNN, a general Graph Neural Network framework that tackles all three tasks, and we show that scMoGNN achieves superior results on all of them compared with state-of-the-art and conventional approaches. Our method is an official winner in the overall ranking of the Modality prediction task of the NeurIPS 2021 Competition (https://openproblems.bio/neurips-2021/), and all implementations of our methods have been integrated into the DANCE package (https://github.com/OmicsML/dance).
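
To give a rough sense of the idea behind a GNN framework of this kind, the sketch below treats cells and features (e.g. genes) as the two node sets of a bipartite graph, with expression values as edge weights, and alternates message passing between the two sides so each cell embedding aggregates information from the features it expresses. The `BipartiteGNN` class, its layer sizes, and the normalization are hypothetical illustrative choices, not the authors' scMoGNN implementation; for that, see the DANCE package linked above.

```python
import torch
import torch.nn as nn

class BipartiteGNN(nn.Module):
    """Toy message passing on a cell-feature bipartite graph (illustrative,
    not the scMoGNN architecture). Each layer propagates cell states to
    feature nodes and back, so a cell's embedding mixes information from
    the features it expresses and from other cells sharing those features."""

    def __init__(self, n_cells, n_feats, hidden, out_dim, n_layers=2):
        super().__init__()
        # Learnable initial states for the two node sets.
        self.cell_emb = nn.Parameter(torch.randn(n_cells, hidden) * 0.01)
        self.feat_emb = nn.Parameter(torch.randn(n_feats, hidden) * 0.01)
        self.cell_lins = nn.ModuleList(nn.Linear(hidden, hidden) for _ in range(n_layers))
        self.feat_lins = nn.ModuleList(nn.Linear(hidden, hidden) for _ in range(n_layers))
        # Per-cell prediction head, e.g. the values of another modality.
        self.head = nn.Linear(hidden, out_dim)

    def forward(self, adj):
        # adj: (n_cells, n_feats) normalized expression matrix used as
        # weighted bipartite adjacency.
        c = self.cell_emb
        for lin_c, lin_f in zip(self.cell_lins, self.feat_lins):
            f = torch.relu(lin_f(adj.t() @ c))  # features gather from cells
            c = torch.relu(lin_c(adj @ f))      # cells gather from features
        return self.head(c)

# Toy usage: predict a 32-dim target modality (e.g. protein abundance)
# from a 200-gene expression profile for 100 cells.
x = torch.rand(100, 200)
adj = x / x.sum(dim=1, keepdim=True).clamp(min=1e-8)  # simple row normalization
model = BipartiteGNN(n_cells=100, n_feats=200, hidden=64, out_dim=32)
pred = model(adj)  # shape (100, 32)
```

The same cell embeddings `c` could serve the other two tasks in spirit: nearest-neighbor search between embeddings of two modalities for Modality matching, or the embeddings themselves as a Joint embedding.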

Cite

APA: Wen, H., Ding, J., Jin, W., Wang, Y., Xie, Y., & Tang, J. (2022). Graph Neural Networks for Multimodal Single-Cell Data Integration. In Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (pp. 4153–4163). Association for Computing Machinery. https://doi.org/10.1145/3534678.3539213