A fully-open structure-guided RNA foundation model for robust structural and functional inference

Abstract

RNA language models have achieved strong performance across diverse downstream tasks by leveraging large-scale sequence data. However, RNA function is fundamentally shaped by hierarchical structure, making the integration of structural information into pre-training essential. Existing methods often depend on noisy structural annotations or introduce task-specific biases, limiting generalizability. Here, we introduce structRFM, a structure-guided RNA foundation model pre-trained on millions of RNA sequences and secondary structures by integrating base-pairing interactions into masked language modeling through a novel pair matching operation. The structure-guided mask and the nucleotide-level mask are further balanced by a dynamic masking ratio. structRFM learns joint knowledge of sequential and structural data, producing versatile representations, including classification-level, sequence-level, and pair-wise matrix features, that support a broad spectrum of downstream adaptations. structRFM ranks among the top models in zero-shot homology classification across fifteen biological language models and sets new benchmarks for secondary structure prediction. structRFM further derives Zfold, which enables robust and reliable tertiary structure prediction, with consistent improvements in both predicted 3D structures and the 2D structures extracted from them, achieving a 19% performance gain over AlphaFold3 on the RNA-Puzzles dataset. In functional tasks such as internal ribosome entry site identification, structRFM achieves a 49% gain in F1 score. These results demonstrate the effectiveness of structure-guided pre-training and highlight a promising direction for developing multi-modal RNA language models in computational biology.
To support the broader scientific community, we have made the 21-million sequence-structure dataset and the pre-trained structRFM model fully open-source, facilitating the development of multimodal foundation models in biology.
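The structure-guided masking idea described in the abstract can be illustrated with a minimal sketch: base pairs are read from dot-bracket notation, and masking is applied jointly to both members of a pair (structure-guided mask) in addition to random single nucleotides (nucleotide-level mask). The function names, masking ratios, and mask token below are illustrative assumptions, not the authors' implementation.

```python
import random

def parse_pairs(dot_bracket):
    """Map each position to its base-pairing partner from dot-bracket notation."""
    stack, pairs = [], {}
    for i, ch in enumerate(dot_bracket):
        if ch == '(':
            stack.append(i)
        elif ch == ')':
            j = stack.pop()
            pairs[i], pairs[j] = j, i
    return pairs

def structure_guided_mask(seq, dot_bracket, pair_ratio=0.1, nt_ratio=0.05,
                          mask_token='M'):
    """Mask whole base pairs jointly, then mask random single nucleotides.

    pair_ratio and nt_ratio stand in for the paper's dynamic masking ratio,
    which would vary these values during pre-training.
    """
    pairs = parse_pairs(dot_bracket)
    masked = list(seq)
    # Structure-guided mask: pick base pairs and mask both partners together.
    pair_positions = [i for i in pairs if i < pairs[i]]
    for i in random.sample(pair_positions,
                           max(1, int(len(pair_positions) * pair_ratio))):
        masked[i] = masked[pairs[i]] = mask_token
    # Nucleotide-level mask on the remaining unmasked positions.
    free = [i for i, c in enumerate(masked) if c != mask_token]
    for i in random.sample(free, max(1, int(len(free) * nt_ratio))):
        masked[i] = mask_token
    return ''.join(masked)
```

In this sketch, masking a pair removes both nucleotides at once, so recovering one masked base requires the model to reason about its pairing partner, which is the intuition behind injecting secondary-structure signal into the masked-language-modeling objective.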