Voltage-dependent reversal potentials in spiking recurrent neural networks enhance energy efficiency and task performance

Abstract

Spiking recurrent neural networks (SRNNs) rival gated RNNs on various tasks, yet they still lack several hallmarks of biological neural networks. We introduce a biologically grounded SRNN that implements Dale’s law with voltage-dependent AMPA and GABA reversal potentials. These reversal potentials modulate synaptic gain as a function of the postsynaptic membrane potential, and we derive theoretically how they make each neuron’s effective dynamics and subthreshold resonance input-dependent. We trained SRNNs on the Spiking Heidelberg Digits dataset and show that SRNNs with reversal potentials cut spike energy by up to 4× while increasing task accuracy. This yields high-performing Dalean SRNNs that substantially improve on Dalean networks without reversal potentials. Thus, Dale’s law with reversal potentials, a core feature of biological neural networks, can render SRNNs more accurate and energy-efficient.
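The voltage-dependent gain the abstract describes can be illustrated with the standard conductance-based synapse model, where the synaptic current scales with the driving force (the gap between the reversal potential and the membrane potential). This is a minimal sketch of that general mechanism, not the authors' exact model; the function name, parameter values, and conductances are illustrative assumptions.

```python
# Standard conductance-based synaptic current with AMPA and GABA reversal
# potentials (illustrative sketch; values and names are assumptions, not the
# paper's implementation). The driving force (E_rev - v) makes the effective
# synaptic gain depend on the postsynaptic membrane potential v.

E_AMPA = 0.0    # AMPA (excitatory) reversal potential, mV
E_GABA = -70.0  # GABA (inhibitory) reversal potential, mV

def synaptic_current(v, g_ampa, g_gaba):
    """Total synaptic current (positive = depolarizing) at membrane potential v (mV)."""
    return g_ampa * (E_AMPA - v) + g_gaba * (E_GABA - v)

# Near rest (-65 mV), excitation has a large driving force; as v approaches
# E_AMPA the excitatory drive shrinks and inhibition dominates, so the same
# conductances produce input-dependent effective dynamics.
print(synaptic_current(-65.0, g_ampa=1.0, g_gaba=0.5))  # 62.5
print(synaptic_current(-10.0, g_ampa=1.0, g_gaba=0.5))  # -20.0
```

In a current-based synapse (no reversal potentials) these two cases would give the same input; the voltage dependence shown here is what makes each neuron's subthreshold response depend on its own state.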
