Graph attention with structural features improves the generalizability of identifying functional sequences at a protein interface

Abstract

Accurate prediction of the set of sequences compatible with a protein-protein interface is an unsolved problem in biology. While supervised sequence-based models trained directly on experimental data can predict variant effects, they often fail to generalize to significantly diverged sequences. We hypothesized that incorporating information from deep learning models of proteins (e.g., ESM, ProteinMPNN) could enhance generalizability. To test this hypothesis, we designed and experimentally screened several deep mutational libraries of the SARS-CoV-2 Spike Receptor Binding Domain (RBD) for binding to the ACE2 receptor. Our large dataset encompasses over 43,000 sequence variants with up to 26 substitutions from the parental RBD sequence, thus exploring a significantly expanded sequence space compared to previous studies. Baseline supervised learning with one-hot encoded sequences achieved high accuracy within training sets but poor performance on unseen libraries. Integrating pre-trained protein model embeddings (ESM2) as features showed a modest improvement in generalization. To further enhance predictive power, we developed a graph attention network architecture that combines representations of local residue environments from protein structure graphs with long-range inter-residue correlations captured by protein language model (PLM) embeddings (GAN-PLM). By explicitly modeling residue environments, interface geometry, and sequence dependencies, our graph attention model outperformed purely sequence-based models, achieving substantially higher balanced accuracies when predicting functional ACE2-binding variants across the diverse sequence space spanned by our independent libraries. This demonstrates the potential of integrating structure- and sequence-based features into deep learning frameworks to achieve accurate and generalizable predictions of protein interface function, with broad implications for understanding and engineering protein interactions relevant to emerging infectious diseases and therapeutic protein design.
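To illustrate the kind of architecture the abstract describes, the sketch below shows a minimal graph attention model that combines per-residue PLM embeddings with a structure graph whose edges connect spatially neighboring residues. This is not the authors' implementation; the class name `RBDBinderClassifier`, the embedding and hidden dimensions, and the two-layer GAT depth are illustrative assumptions.

```python
# Minimal sketch (assumed, not the authors' code): a graph attention network
# over a residue contact graph, with PLM (e.g., ESM2) embeddings as node
# features and a binary head for functional vs. non-functional variants.
import torch
import torch.nn as nn
from torch_geometric.nn import GATConv, global_mean_pool


class RBDBinderClassifier(nn.Module):
    def __init__(self, plm_dim: int = 1280, hidden_dim: int = 128, heads: int = 4):
        super().__init__()
        # Project PLM residue embeddings down to a working dimension.
        self.proj = nn.Linear(plm_dim, hidden_dim)
        # Graph attention layers aggregate information over the structure
        # graph (residues as nodes, spatial contacts as edges).
        self.gat1 = GATConv(hidden_dim, hidden_dim, heads=heads, concat=False)
        self.gat2 = GATConv(hidden_dim, hidden_dim, heads=heads, concat=False)
        # Binary classification head (binds ACE2 vs. does not).
        self.head = nn.Linear(hidden_dim, 1)

    def forward(self, x, edge_index, batch):
        # x:          [num_residues, plm_dim] per-residue PLM embeddings
        # edge_index: [2, num_edges] residue-residue contacts from structure
        # batch:      [num_residues] graph membership for pooling per variant
        h = torch.relu(self.proj(x))
        h = torch.relu(self.gat1(h, edge_index))
        h = torch.relu(self.gat2(h, edge_index))
        return self.head(global_mean_pool(h, batch)).squeeze(-1)
```

In this sketch the attention weights over structure-graph edges play the role of the local residue-environment representation, while the PLM embeddings supply the long-range sequence dependencies; the pooled graph representation is scored as a logit for ACE2 binding.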