Transforming the Language of Life: Transformer Neural Networks for Protein Prediction Tasks

Abstract

The scientific community is rapidly generating protein sequence information, but only a fraction of these proteins can be experimentally characterized. While promising deep learning approaches for protein prediction tasks have emerged, they have computational limitations or are designed to solve a specific task. We present a Transformer neural network that pre-trains task-agnostic sequence representations. This model is fine-tuned to solve two different protein prediction tasks: protein family classification and protein interaction prediction. Our method is comparable to existing state-of-the-art approaches for protein family classification while being much more general than other architectures. Further, our method outperforms all other approaches for protein interaction prediction. These results offer a promising framework for fine-tuning the pre-trained sequence representations for other protein prediction tasks.
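To make the pre-train/fine-tune workflow concrete, the following is a minimal sketch, not the authors' implementation: a Transformer encoder over amino-acid tokens whose pooled representation feeds a task-specific classification head (here, protein family prediction), built with PyTorch's standard nn.TransformerEncoder. All vocabulary handling, module names, hyperparameters, and the toy training step are illustrative assumptions rather than values from the paper.

```python
# Minimal sketch, NOT the authors' code: a Transformer encoder over amino-acid
# tokens with a swappable task head (protein family classification shown).
# Vocabulary, model sizes, and the training snippet are illustrative assumptions.
import torch
import torch.nn as nn

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
PAD = 0                                              # padding token id
VOCAB = {aa: i + 1 for i, aa in enumerate(AMINO_ACIDS)}

def encode(seq: str, max_len: int = 128) -> torch.Tensor:
    """Map an amino-acid string to a fixed-length tensor of token ids."""
    ids = [VOCAB.get(aa, PAD) for aa in seq[:max_len]]
    ids += [PAD] * (max_len - len(ids))
    return torch.tensor(ids)

class ProteinTransformer(nn.Module):
    """Task-agnostic sequence encoder plus a task-specific classification head."""
    def __init__(self, n_classes: int, d_model: int = 256, n_heads: int = 4,
                 n_layers: int = 4, max_len: int = 128):
        super().__init__()
        self.tok_emb = nn.Embedding(len(VOCAB) + 1, d_model, padding_idx=PAD)
        self.pos_emb = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, n_classes)    # e.g. protein family labels

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        pos = torch.arange(tokens.size(1), device=tokens.device)
        x = self.tok_emb(tokens) + self.pos_emb(pos)
        x = self.encoder(x, src_key_padding_mask=tokens.eq(PAD))
        return self.head(x.mean(dim=1))              # simple mean pooling for brevity

# One fine-tuning step on a toy labelled batch; in practice the encoder weights
# would first be initialized from the pre-trained sequence model.
model = ProteinTransformer(n_classes=10)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
batch = torch.stack([encode("MKTAYIAKQR"), encode("GAVLIMCFYW")])
labels = torch.tensor([3, 7])
optimizer.zero_grad()
loss = nn.functional.cross_entropy(model(batch), labels)
loss.backward()
optimizer.step()
```

For the second task, protein interaction prediction, the same encoder could instead consume a pair of sequences whose representations feed a binary head; that variant is omitted from this sketch.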

Citation (APA)

Nambiar, A., Heflin, M., Liu, S., Maslov, S., Hopkins, M., & Ritz, A. (2020). Transforming the language of life: Transformer neural networks for protein prediction tasks. In Proceedings of the 11th ACM International Conference on Bioinformatics, Computational Biology and Health Informatics (BCB 2020). Association for Computing Machinery. https://doi.org/10.1145/3388440.3412467
