DGRNA: a long-context RNA foundation model with bidirectional attention Mamba2

Abstract

Ribonucleic acid (RNA) is an important biomolecule with diverse functions, e.g. genetic information transfer and the regulation of gene expression and cellular functions. In recent years, the rapid development of sequencing technology has significantly enhanced our understanding of RNA biology and advanced RNA-based therapies, producing a huge volume of RNA data. Data-driven methods, particularly unsupervised large language models, have been used to automatically extract hidden semantic information from these RNA data. Current RNA large language models are primarily based on the Transformer architecture, which cannot efficiently process long RNA sequences, whereas the Mamba architecture alleviates the quadratic complexity associated with Transformer attention. In this study, we propose DGRNA, a large foundation model based on bidirectional Mamba and trained on 100 million RNA sequences, which demonstrates exceptional performance across six RNA downstream tasks compared to existing RNA language models.
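As a rough illustration of the idea behind a "bidirectional Mamba" block, the sketch below wraps a causal sequence mixer in forward and reverse passes over the token dimension and merges the two directions. This is a minimal, hypothetical sketch and not DGRNA's actual implementation: the `CausalMixerStub` class, the merge-by-summation, and all parameter names are assumptions; in a real model the stub would be replaced by a Mamba2 layer (e.g. from the mamba-ssm package).

```python
# Hypothetical sketch: bidirectional wrapping of a causal sequence mixer.
# Assumption: DGRNA-style bidirectionality = one left-to-right pass plus one
# right-to-left pass of a causal state-space layer, merged with a residual.
import torch
import torch.nn as nn


class CausalMixerStub(nn.Module):
    """Placeholder for a causal sequence mixer such as a Mamba2 block."""

    def __init__(self, d_model: int):
        super().__init__()
        self.proj = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model); a real Mamba2 block mixes along seq_len.
        return self.proj(x)


class BidirectionalBlock(nn.Module):
    """Run a causal mixer left-to-right and right-to-left, then merge."""

    def __init__(self, d_model: int):
        super().__init__()
        self.fwd = CausalMixerStub(d_model)
        self.bwd = CausalMixerStub(d_model)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        forward_out = self.fwd(x)
        # Flip the sequence so the second mixer sees right-to-left context,
        # then flip back so positions align with the forward output.
        backward_out = self.bwd(x.flip(dims=[1])).flip(dims=[1])
        # Residual connection plus the two directional outputs.
        return self.norm(x + forward_out + backward_out)


if __name__ == "__main__":
    tokens = torch.randn(2, 1024, 512)  # (batch, RNA sequence length, hidden size)
    block = BidirectionalBlock(d_model=512)
    print(block(tokens).shape)  # torch.Size([2, 1024, 512])
```

Because each directional pass is linear in sequence length, such a block avoids the quadratic cost of full self-attention, which is the motivation the abstract gives for choosing Mamba over a Transformer backbone for long RNA sequences.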