Modeling Hierarchical Functional Brain Networks via L2-Normalized Fully Convolutional Recurrent Attention Autoencoder for Multi-task fMRI Data
Abstract
Modeling functional brain networks is essential for revealing the functional mechanisms of the human brain. Deep Neural Network (DNN) models have been widely employed to extract multi-scale spatiotemporal features from functional Magnetic Resonance Imaging (fMRI) data. Nonetheless, existing DNN approaches often struggle to capture generic temporal features across diverse tasks because of their reliance on fully connected layers. To overcome this limitation, this paper introduces a novel framework based on an L2-Normalized Fully Convolutional Recurrent Attention Autoencoder (L2-FCRAAE), designed to model hierarchical functional brain networks (FBNs). The L2-FCRAAE framework is built from fully convolutional recurrent layers and contains no fully connected layers. This design choice makes the model adaptable to variable-length fMRI data and facilitates the capture of temporal dependencies in sequential data. Consequently, it effectively models temporal dynamics and aids in recognizing brain states from fMRI data. Moreover, incorporating normalized temporal and channel attention blocks into the encoder helps prevent overfitting during training, thereby enhancing the model's representation capacity. Experimental results demonstrate that the proposed L2-FCRAAE exhibits superior capability and generalizability in capturing the spatial and temporal patterns of FBNs. It robustly identifies task-related components and resting-state brain networks (RSNs) in a hierarchical manner. Overall, this study presents a novel approach for understanding the hierarchical organization of functional brain architecture. If this paper is accepted, the code will be made publicly available.
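
To make the architectural ideas in the abstract concrete, the following is a minimal illustrative sketch (not the authors' released code) of an encoder assembled only from convolutional and recurrent layers, with L2-normalized channel and temporal attention and no fully connected layers, so it accepts fMRI sequences of arbitrary temporal length. All layer sizes, kernel widths, the squeeze-and-excitation-style channel attention, and the GRU recurrence are assumptions made for illustration only.

import torch
import torch.nn as nn
import torch.nn.functional as F


class ChannelAttention(nn.Module):
    # Channel attention built from 1x1 convolutions (no Linear layers);
    # attention weights are L2-normalized before gating (assumed design).
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.excite = nn.Sequential(
            nn.Conv1d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv1d(channels // reduction, channels, kernel_size=1),
        )

    def forward(self, x):                              # x: (batch, channels, time)
        w = self.excite(x.mean(dim=-1, keepdim=True))  # global temporal pooling
        w = F.normalize(w, p=2, dim=1)                 # L2-normalize channel weights
        return x * torch.sigmoid(w)


class TemporalAttention(nn.Module):
    # Per-timestep attention scores from a 1x1 convolution, L2-normalized over time.
    def __init__(self, channels):
        super().__init__()
        self.score = nn.Conv1d(channels, 1, kernel_size=1)

    def forward(self, x):                              # x: (batch, channels, time)
        s = F.normalize(self.score(x), p=2, dim=-1)    # (batch, 1, time)
        return x * torch.sigmoid(s)


class FullyConvRecurrentEncoder(nn.Module):
    # Fully convolutional + recurrent encoder: because there are no fully
    # connected layers, the same weights handle any input sequence length.
    def __init__(self, in_channels, hidden_channels=64):
        super().__init__()
        self.conv = nn.Conv1d(in_channels, hidden_channels, kernel_size=3, padding=1)
        self.cattn = ChannelAttention(hidden_channels)
        self.tattn = TemporalAttention(hidden_channels)
        self.gru = nn.GRU(hidden_channels, hidden_channels, batch_first=True)

    def forward(self, x):                              # x: (batch, ROIs/voxels, time)
        h = F.relu(self.conv(x))
        h = self.tattn(self.cattn(h))
        h, _ = self.gru(h.transpose(1, 2))             # recurrence over time steps
        return h                                       # (batch, time, hidden_channels)


if __name__ == "__main__":
    # The same encoder handles fMRI inputs of different temporal lengths.
    enc = FullyConvRecurrentEncoder(in_channels=90)
    for t in (120, 200):
        print(enc(torch.randn(2, 90, t)).shape)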