Transformer Neural Networks Attending to Both Sequence and Structure for Protein Prediction Tasks

06/17/2022
by   Anowarul Kabir, et al.

The growing number of protein sequences decoded from genomes is opening new avenues for linking protein sequence to function with transformer neural networks. Recent research has shown that the number of known protein sequences is large enough to support learning useful, task-agnostic sequence representations via transformers. In this paper, we posit that learning joint sequence-structure representations yields better representations for function-related prediction tasks. We propose a transformer neural network that attends to both sequence and tertiary structure. We show that such joint representations are more powerful than sequence-only representations, and that they yield better performance on superfamily membership prediction across various metrics.
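One common way to make attention "structure-aware" is to bias the attention logits with pairwise residue distances derived from the tertiary structure, so that spatially close residues attend to each other more strongly. The sketch below illustrates this general idea only; the paper's exact attention mechanism is not specified in the abstract, and the function name, bias form, and parameters here are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def structure_biased_attention(seq_emb, coords, w_q, w_k, w_v, bias_scale=1.0):
    """One attention head over residue embeddings whose logits are biased
    by pairwise 3D distances between residues (hypothetical mechanism;
    the paper's actual formulation may differ).

    seq_emb: (L, d) per-residue sequence embeddings
    coords:  (L, 3) residue coordinates from the tertiary structure
    """
    q, k, v = seq_emb @ w_q, seq_emb @ w_k, seq_emb @ w_v
    logits = (q @ k.T) / np.sqrt(q.shape[-1])          # standard scaled dot-product
    diff = coords[:, None, :] - coords[None, :, :]      # (L, L, 3) pairwise offsets
    dist = np.linalg.norm(diff, axis=-1)                # (L, L) pairwise distances
    logits = logits - bias_scale * dist                 # closer residues -> larger logits
    return softmax(logits) @ v                          # (L, d) joint representation
```

A subtractive distance bias is just one design choice; binned distance embeddings or contact-map masks are common alternatives for injecting structure into attention.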
