A hybrid model based on transformer and Mamba for enhanced sequence modeling
Xiaocui Zhu
Qunsheng Ruan
Sai Qian
Miaohui Zhang
Abstract
No abstract available
Article activity feed
Apr 3, 2025: version published to 10.1038/s41598-025-87574-8
Aug 23, 2024: version published to 10.21203/rs.3.rs-4782985/v1 on Research Square