ABSTRACT: Self-supervised neural language models with attention have recently been applied to biological sequence data, advancing structure, function and mutational effect prediction. Some protein language models, including MSA Transformer and AlphaFold's EvoFormer, take multiple sequence alignments (MSAs) of evolutionarily related proteins as inputs. Simple combinations of MSA Transformer's row attentions have led to state-of-the-art unsupervised structural contact prediction. We demonstrate that similar ...
SUBMITTER: Lupo U
PROVIDER: S-EPMC9588007 | biostudies-literature | 2022 Oct
REPOSITORIES: biostudies-literature
AUTHORS: Lupo, Umberto; Sgarbossa, Damiano; Bitbol, Anne-Florence
JOURNAL: Nature Communications, 22 Oct 2022
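The abstract's mention of combining MSA Transformer's row attentions for unsupervised contact prediction can be sketched as follows. This is a minimal illustration under assumptions: `row_attn` stands in for a precomputed row-attention tensor from an MSA-Transformer-like model, the uniform average over layers and heads is a hypothetical stand-in for a learned combination, and the Average Product Correction (APC) step follows common practice in contact prediction.

```python
import numpy as np

def apc(m):
    # Average Product Correction: subtracts the rank-one background
    # expected from row/column marginals, sharpening contact signal.
    return m - np.outer(m.sum(axis=1), m.sum(axis=0)) / m.sum()

def contact_scores(row_attn):
    """row_attn: array of shape (layers, heads, L, L) holding row-attention
    maps (hypothetical input; a real pipeline would extract these from the
    model). Returns an (L, L) matrix of contact scores."""
    a = row_attn.mean(axis=(0, 1))   # simple average over layers and heads
    a = 0.5 * (a + a.T)              # symmetrize: contacts are undirected
    return apc(a)                    # APC-corrected score map

# Toy usage with random attention maps for a length-8 sequence.
rng = np.random.default_rng(0)
scores = contact_scores(rng.random((2, 4, 8, 8)))
print(scores.shape)  # (8, 8)
```

In practice the per-head maps would be weighted (e.g. by a small logistic regression, as in published contact-prediction work) rather than averaged uniformly; the uniform average here only keeps the sketch self-contained.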