A Mean-Field Approach to Criticality in Spiking Neural Networks for Reservoir Computing

Abstract

Reservoir computing is a neural network paradigm for processing temporal data by exploiting the dynamics of a fixed, high-dimensional system, enabling efficient computation with reduced complexity compared to fully trainable recurrent networks. This work presents an analytical framework for configuring a reservoir of spiking neural networks with a highly general topology in the critical regime. Specifically, we derive and solve a mean-field equation that governs the evolution of the average membrane potential in leaky integrate-and-fire neurons, and provide an approximation for the critical point. This framework reduces the need for extensive online fine-tuning, offering a streamlined path to near-optimal network performance from the outset. Through extensive simulations, we validate the theoretical predictions by analyzing the network's spiking dynamics and quantifying its computational capacity using the information-based Lempel-Ziv-Welch complexity near criticality. Finally, we explore self-organized quasi-criticality by implementing a local learning rule for synaptic weights, demonstrating that the network's dynamics remain close to the theoretical critical point. Beyond AI, our approach and findings also have significant implications for computational neuroscience, providing a principled framework for quantitatively understanding how biological networks leverage criticality for efficient information processing.
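As a rough illustration of the complexity measure named above (a sketch, not the authors' implementation): the Lempel-Ziv-Welch complexity of a binarized spike train can be estimated by counting the number of phrases produced by an LZW-style dictionary parse, where the spike train is assumed to be encoded as a string of 0s and 1s. The function name and encoding here are illustrative assumptions.

```python
def lzw_complexity(spike_train: str) -> int:
    """Count phrases in an LZW-style parse of a binarized spike train.

    A more regular (e.g. periodic) sequence compresses into fewer
    phrases; a more irregular sequence yields more phrases, which is
    the intuition behind using this count as a complexity proxy.
    """
    # Seed the dictionary with the single symbols present in the input.
    dictionary = set(spike_train)
    phrases = 0
    current = ""
    for symbol in spike_train:
        candidate = current + symbol
        if candidate in dictionary:
            # Known phrase so far: keep extending it.
            current = candidate
        else:
            # New phrase: count it, store it, restart from this symbol.
            phrases += 1
            dictionary.add(candidate)
            current = symbol
    if current:
        phrases += 1  # flush the final (possibly repeated) phrase
    return phrases


# A periodic "spike train" should parse into fewer phrases
# than a more irregular one of the same length.
periodic = "01" * 8
irregular = "0110100110010110"
assert lzw_complexity(periodic) < lzw_complexity(irregular)
```

Near criticality, the expectation would be that spike trains are neither fully regular nor fully random, yielding an intermediate phrase count; the hypothetical function above could be applied per neuron or to a pooled network raster.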
