Reshaping Reservoirs with Hebbian Plasticity: Unsupervised Adaptation that Works
Abstract
Reservoir Computing (RC) is a lightweight way to model time-dependent data, yet its reliance on static, randomly initialized network architectures often limits performance on challenging real-world problems. We introduce Hebbian Architecture Generation (HAG), an unsupervised rule that grows connections between neurons that frequently activate together, embodying the biological maxim "neurons that fire together wire together." Starting from an almost empty reservoir, HAG progressively sculpts a task-specific wiring. Across a diverse set of classification and forecasting tasks, reservoirs reshaped by HAG are consistently more accurate than traditional Echo State Networks and than reservoirs tuned with popular plasticity rules such as Intrinsic Plasticity or anti-Oja learning. In other words, letting the network rewire itself from data turns a once-static RC model into a flexible, high-performance learner without a single gradient step. By coupling the efficiency of RC with the adaptability of Hebbian plasticity, HAG moves reservoir computing closer to its biological inspiration and shows that structural self-organisation is a practical route to robust, task-aware processing of real-world time-series data.
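The abstract does not spell out the exact update rule, so the following is only a minimal sketch of the idea it describes, assuming a leaky echo-state reservoir in NumPy. The function names (`reservoir_states`, `grow_connections`), the correlation threshold, the new-edge weight, and the spectral-radius rescaling are all illustrative assumptions, not the authors' HAG algorithm.

```python
# Sketch only: Hebbian connection growth in a reservoir, under the
# assumptions stated above. Not the paper's exact HAG procedure.
import numpy as np

rng = np.random.default_rng(0)

def reservoir_states(W, W_in, inputs, leak=0.3):
    """Run a leaky echo-state reservoir and collect its state trajectory."""
    x = np.zeros(W.shape[0])
    states = []
    for u in inputs:
        x = (1 - leak) * x + leak * np.tanh(W @ x + W_in @ u)
        states.append(x.copy())
    return np.array(states)  # shape: (timesteps, n_neurons)

def grow_connections(W, states, corr_threshold=0.8, weight=0.1):
    """Hebbian-style growth: wire neuron pairs whose activities correlate.

    "Neurons that fire together wire together": pairs with strongly
    correlated activity and no existing edge receive a new connection.
    """
    C = np.corrcoef(states.T)       # pairwise activity correlations
    np.fill_diagonal(C, 0.0)        # ignore self-correlation
    new_edges = (np.abs(C) > corr_threshold) & (W == 0)
    W = W.copy()
    W[new_edges] = weight * np.sign(C[new_edges])
    return W

# Start from an almost empty reservoir and let the data sculpt the wiring.
n_neurons, n_inputs = 100, 1
W = np.zeros((n_neurons, n_neurons))        # nearly empty recurrent matrix
W_in = rng.uniform(-0.5, 0.5, (n_neurons, n_inputs))
inputs = np.sin(np.linspace(0, 8 * np.pi, 500))[:, None]  # toy time series

for _ in range(5):                          # a few growth epochs
    states = reservoir_states(W, W_in, inputs)
    W = grow_connections(W, states)
    # Rescale the spectral radius so the growing reservoir keeps the
    # echo state property (an illustrative choice, not from the paper).
    rho = np.max(np.abs(np.linalg.eigvals(W)))
    if rho > 0.9:
        W *= 0.9 / rho
```

The rescaling step is one common way to keep the dynamics stable as new edges accumulate; whatever stabilisation HAG itself uses would replace it.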