AlphaFold2’s training set powers its predictions of fold-switched conformations
Abstract
AlphaFold2 (AF2), a deep learning-based model that predicts protein structures from their amino acid sequences, has recently been used to predict multiple protein conformations. In some cases, AF2 has successfully predicted both dominant and alternative conformations of fold-switching proteins, which remodel their secondary and tertiary structures in response to cellular stimuli. Whether AF2 has learned enough protein folding principles to reliably predict alternative conformations outside of its training set is unclear. Here, we address this question by assessing whether CFold, an implementation of the AF2 network trained on a more limited subset of experimentally determined protein structures, predicts alternative conformations of eight fold switchers from six protein families. Previous work suggests that AF2 predicted these alternative conformations by memorizing them during training. Unlike AF2, CFold's training set contains only one of these alternative conformations. Although we sampled 1,300-4,400 structures per protein with various sequence sampling techniques, CFold accurately and confidently predicted only one alternative structure outside of its training set, while also generating experimentally inconsistent structures with higher confidence. Though these results indicate that AF2's current success in predicting alternative conformations of fold switchers stems largely from its training data, results from a sequence pruning technique suggest developments that could lead to a more reliable generative model in the future.
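The "sequence sampling techniques" mentioned above commonly involve subsampling the input multiple sequence alignment (MSA), so that the network receives a shallower coevolutionary signal and may drift toward alternative conformations. The sketch below is a minimal illustration of that general idea only, assuming random MSA subsampling; the function and parameter names (e.g., subsample_msa, n_sequences) are hypothetical and are not taken from the CFold or AF2 codebases.

```python
import random

def subsample_msa(msa, n_sequences, seed=0):
    """Randomly subsample an MSA (a list of aligned sequence strings),
    always keeping the query sequence in the first row.

    Shallower alignments weaken the coevolutionary signal, which is one
    way structure-prediction networks have been pushed to explore
    alternative conformations (illustrative sketch, not the CFold/AF2 API).
    """
    rng = random.Random(seed)
    query, rest = msa[0], msa[1:]
    k = min(max(n_sequences - 1, 0), len(rest))
    return [query] + rng.sample(rest, k)

# Example: build many shallow alignments from one deep MSA, each of which
# would be fed to the network as a separate prediction run.
full_msa = [
    "MKVLA--QERT",  # query (placeholder alignment rows)
    "MKILA--QDRT",
    "MRVLAG-QERS",
    "MKVLA--KERT",
]
shallow_msas = [subsample_msa(full_msa, n_sequences=3, seed=s) for s in range(10)]
```

In practice, one prediction is generated per subsampled alignment, and the resulting ensemble is then compared against the experimentally determined dominant and alternative conformations.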