StyleMamba: Efficient Image Style Transfer with Bidirectional Selective Scan Vision Mamba
Abstract
Style transfer aims to render stylized images with artistic features while maintaining the original content. Traditional CNN-based approaches have limitations in dealing with global information and long-range dependencies in style transfer. Existing transformer-based approaches mitigate these problems well but incur high computational costs. Mamba addresses these limitations using a selective structured state-space model (S4), which maintains linear complexity while effectively handling long-range dependencies. In this paper, we propose StyleMamba, an efficient image style transfer architecture based on a Bidirectional Selective Scan mechanism, which balances local and global dependencies with computational efficiency through spatial dyadic state-space modeling. In addition, we design Dynamic Gate Fusion (DGF) to adaptively fuse dual-path outputs based on feature relevance. Through qualitative and quantitative experiments on representative datasets, we demonstrate the advantages of our model over state-of-the-art (SOTA) transformer-based methods and other approaches.
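To make the fusion idea concrete, the following is a minimal NumPy sketch of how a gated fusion of the two scan directions could look. The abstract does not give DGF's exact formulation, so the sigmoid gate over concatenated forward/backward features, and all names (`dynamic_gate_fusion`, `f_fwd`, `f_bwd`, `W`, `b`), are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def dynamic_gate_fusion(f_fwd, f_bwd, W, b):
    """Adaptively fuse forward- and backward-scan features.

    Assumed form (not from the paper): a per-channel gate
    g = sigmoid([f_fwd; f_bwd] @ W + b) weights the two paths,
    so the mix adapts to feature relevance at each position.
    """
    g = sigmoid(np.concatenate([f_fwd, f_bwd], axis=-1) @ W + b)
    return g * f_fwd + (1.0 - g) * f_bwd

# Toy example: a sequence of L tokens with d channels per token.
rng = np.random.default_rng(0)
L, d = 4, 8
f_fwd = rng.standard_normal((L, d))   # stand-in forward-scan output
f_bwd = rng.standard_normal((L, d))   # stand-in backward-scan output
W = rng.standard_normal((2 * d, d)) * 0.1
b = np.zeros(d)

fused = dynamic_gate_fusion(f_fwd, f_bwd, W, b)
print(fused.shape)  # (4, 8)
```

Because the gate lies in (0, 1), each fused element is a convex combination of the corresponding forward and backward features, so neither path can be amplified beyond its own range.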