Emergence of Sparse Coding, Balance and Decorrelation from a Biologically-Grounded Spiking Neural Network Model of Learning in the Primary Visual Cortex
Abstract
Many computational studies address the question of information representation in biological neural networks through explicit optimization of an objective function. These approaches begin from principles of information representation that the network is expected to satisfy, and derive learning rules from those principles.
This study approaches the question from the opposite direction, beginning with a model built on the experimentally observed properties of neural responses, homeostasis, and synaptic plasticity. The known properties of information representation are then expected to emerge from this substrate.
A spiking neural model of the primary visual cortex (V1) was investigated. Populations of both inhibitory and excitatory leaky integrate-and-fire neurons with recurrent connections received spiking input from simulated ON and OFF neurons of the lateral geniculate nucleus, which was in turn driven by natural image stimuli. All synapses underwent learning using spike-timing-dependent plasticity (STDP) rules. A homeostatic rule adjusted each neuron's weights and threshold based on target homeostatic spiking rates and mean synaptic input values.
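The combination of mechanisms described above can be illustrated with a minimal sketch: a single leaky integrate-and-fire neuron driven by Poisson "LGN" spike trains, with pair-based STDP on the input weights and a homeostatic threshold that drifts toward a target firing rate. This is not the authors' implementation; every parameter value (time constants, STDP amplitudes, target rate, learning rates) is an illustrative assumption.

```python
# Minimal sketch (assumed parameters, not the paper's): LIF neuron with
# STDP input weights and a homeostatic adaptive threshold.
import numpy as np

rng = np.random.default_rng(0)

n_in = 50          # number of simulated LGN afferents
dt = 1e-3          # time step (s)
T = 2000           # number of steps (2 s)
tau_m = 20e-3      # membrane time constant (s, assumed)
tau_stdp = 20e-3   # STDP trace time constant (s, assumed)
A_plus, A_minus = 0.01, 0.012   # STDP amplitudes (assumed)
target_rate = 5.0  # homeostatic target rate (Hz, assumed)
eta_theta = 0.1    # threshold adaptation rate (assumed)

w = rng.uniform(0.0, 0.5, n_in)   # input synaptic weights
v = 0.0                           # membrane potential
theta = 1.0                       # adaptive spike threshold
x_pre = np.zeros(n_in)            # presynaptic STDP traces
x_post = 0.0                      # postsynaptic STDP trace
n_spikes = 0

for t in range(T):
    pre = rng.random(n_in) < 20.0 * dt   # 20 Hz Poisson input spikes
    # leaky integration of weighted input spikes
    v += dt / tau_m * (-v) + w @ pre
    # decay STDP traces, then register new input spikes
    x_pre *= np.exp(-dt / tau_stdp)
    x_post *= np.exp(-dt / tau_stdp)
    x_pre[pre] += 1.0
    # depression: input spike arriving after an output spike
    w[pre] -= A_minus * x_post
    np.clip(w, 0.0, 1.0, out=w)
    if v >= theta:                        # output spike
        v = 0.0
        n_spikes += 1
        x_post += 1.0
        # potentiation: output spike following input spikes
        w = np.clip(w + A_plus * x_pre, 0.0, 1.0)
    # homeostasis: raise threshold above target rate, lower it below
    rate_est = n_spikes / ((t + 1) * dt)
    theta += eta_theta * dt * (rate_est - target_rate)

print(f"mean weight {w.mean():.3f}, rate {n_spikes / (T * dt):.1f} Hz")
```

In the full model this plasticity and homeostasis would operate on all synapses of recurrently connected excitatory and inhibitory populations; the sketch only shows the single-neuron learning loop.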
These experimentally grounded rules gave rise to a number of the expected properties of information representation. The network showed a temporally sparse spike response to inputs, associated with a sparse code with Gabor-like receptive fields. The network was balanced at both slow and fast time scales: increased excitatory input was met by increased inhibition. This balance was associated with decorrelated firing, observed as population sparseness, which was both the cause and the result of the decorrelation of receptive fields. These emergent properties (balance, temporal sparseness, population sparseness, and decorrelation) indicate that the network implements expected principles of information processing: efficient coding, information maximization ('infomax'), and a lateral or single-layer form of predictive coding.
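The two sparseness measures reported here can be made concrete with a short sketch, assuming the commonly used Treves-Rolls index in its Vinje-Gallant normalized form (the paper may use a different formulation). For a response matrix of neurons by stimuli, lifetime (temporal) sparseness is computed per neuron across stimuli, and population sparseness per stimulus across neurons; values near 0 indicate dense responses, values near 1 highly sparse ones.

```python
# Sketch of sparseness quantification (Treves-Rolls index, normalized
# to [0, 1]); the response matrices are synthetic toy data.
import numpy as np

def sparseness(r, axis):
    """Normalized Treves-Rolls sparseness of responses r along `axis`."""
    n = r.shape[axis]
    a = r.mean(axis=axis) ** 2 / (r ** 2).mean(axis=axis)
    return (1.0 - a) / (1.0 - 1.0 / n)

rng = np.random.default_rng(1)
# toy response matrices: 100 neurons x 200 stimuli
dense = rng.uniform(0.5, 1.0, (100, 200))        # every neuron responds
sparse = rng.exponential(1.0, (100, 200)) ** 3   # few strong responses

for name, r in [("dense", dense), ("sparse", sparse)]:
    lifetime = sparseness(r, axis=1).mean()    # per neuron, over stimuli
    population = sparseness(r, axis=0).mean()  # per stimulus, over neurons
    print(f"{name}: lifetime {lifetime:.2f}, population {population:.2f}")
```

High population sparseness in this sense is one signature of decorrelated firing: on any given stimulus only a small, changing subset of neurons responds strongly.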
These emergent features of the network were shown to be robust to randomized jitter of the values of key simulation parameters.