Stochastic activity in low-rank recurrent neural networks
Abstract
The geometrical and statistical properties of brain activity depend on the way neurons connect together to form recurrent circuits. How the structure of connectivity shapes the emergent activity, however, remains not fully understood. We investigate this question in recurrent neural networks with linear additive stochastic dynamics. We assume that the synaptic connectivity can be expressed in a low-rank form, parameterized by a handful of connectivity vectors, and examine how the geometry of emergent activity relates to these vectors. Our findings reveal that this relationship critically depends on the dimensionality of the external stochastic inputs. When inputs are low-dimensional, activity is confined to a low-dimensional subspace spanned by the external inputs together with a subset of the connectivity vectors, whose dimensionality matches the rank of the connectivity matrix. Conversely, when inputs are high-dimensional, activity is generally high-dimensional; recurrent dynamics shape activity within a subspace spanned by all connectivity vectors, with a dimensionality equal to twice the rank of the connectivity matrix. Applying our formalism to excitatory-inhibitory circuits, we discuss how input geometry also plays a crucial role in determining the amount of input amplification generated by non-normal dynamics. Our work provides a foundation for studying activity in structured brain circuits under realistic noise conditions, and offers a framework for interpreting stochastic models inferred from experimental data.
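The high-dimensional-input result can be illustrated numerically. The sketch below is a hypothetical minimal example, not the paper's own code: it assumes a rank-1 connectivity J = m nᵀ/N in the linear stochastic system dx = (−x + Jx)dt + σ dW driven by isotropic (high-dimensional) white noise, solves the Lyapunov equation for the stationary covariance, and checks that the correction relative to the unconnected network is confined to the plane spanned by the two connectivity vectors, i.e. has dimensionality 2 = 2 × rank(J).

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Hypothetical rank-1 network: J = m n^T / N, with connectivity vectors m, n.
rng = np.random.default_rng(0)
N, sigma = 200, 1.0
m, n = rng.standard_normal(N), rng.standard_normal(N)
J = np.outer(m, n) / N
A = -np.eye(N) + J  # drift matrix of dx = A x dt + sigma dW

# Stationary covariance S solves the Lyapunov equation A S + S A^T = -sigma^2 I.
S = solve_continuous_lyapunov(A, -sigma**2 * np.eye(N))

# Without recurrence (J = 0) the covariance is isotropic, (sigma^2 / 2) I.
# The correction induced by the connectivity should be supported on span{m, n},
# so its effective rank should be 2 = 2 * rank(J) for generic m, n.
D = S - 0.5 * sigma**2 * np.eye(N)
eigvals = np.sort(np.abs(np.linalg.eigvalsh(D)))[::-1]
effective_rank = int(np.sum(eigvals > 1e-8 * eigvals[0]))
print(effective_rank)  # expect 2 for generic (non-parallel) m, n
```

The same check can be repeated for higher ranks by summing several outer products m_k n_kᵀ/N; the effective rank of the covariance correction then grows as twice the rank of J, consistent with the statement above.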