E-SKAN: Breaking the Efficiency-Accuracy Frontier in Neuromorphic Computing via Event-Driven Kolmogorov-Arnold Networks


Abstract

Spiking Neural Networks (SNNs) offer a promising path toward energy-efficient AI, but they traditionally require large parameter counts to match the accuracy of conventional networks. Kolmogorov-Arnold Networks (KANs) provide interpretable, parameter-efficient representations through learnable spline functions, yet their continuous computation requirements seem fundamentally incompatible with the discrete, sparse nature of SNNs. We introduce E-SKAN (Event-Driven Spiking Kolmogorov-Arnold Networks), a novel architecture that bridges this gap. Our key insight is that synaptic traces decay slowly, enabling us to skip redundant spline recomputations when trace changes fall below a threshold δ; this restores computational sparsity to the KAN framework. On MNIST, E-SKAN achieves 97.94% accuracy with 24% fewer parameters (179K vs. 235K) compared to a baseline SNN. On N-MNIST (neuromorphic event-based data), E-SKAN achieves 94.00% accuracy with 40% fewer parameters (375K vs. 626K). Our validation confirms that delta-gating is effective, with a mean trace change of 0.02, well below the δ = 0.05 threshold. E-SKAN represents the first architecture to simultaneously improve accuracy and parameter efficiency over standard SNNs on both static and event-based neuromorphic data.
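To make the delta-gating idea concrete, the sketch below shows one way the described mechanism could be implemented: spline outputs are cached and recomputed only for samples whose synaptic trace has changed by more than δ since the last evaluation. All class and parameter names here are illustrative assumptions, not the paper's actual code, and the spline basis is a simplified placeholder rather than a full KAN B-spline layer.

```python
# Inference-time sketch of delta-gated spline evaluation (hypothetical names;
# the authors' E-SKAN implementation may differ).
import torch

class DeltaGatedSpline:
    """Skips spline recomputation when the synaptic trace changes by < delta."""

    def __init__(self, coeffs: torch.Tensor, delta: float = 0.05):
        # coeffs: (out_features, in_features, num_knots) learned spline weights (KAN-style).
        self.coeffs = coeffs
        self.delta = delta
        self.last_trace = None   # trace values at the last recomputation
        self.cached_out = None   # spline output corresponding to last_trace

    def _spline(self, trace: torch.Tensor) -> torch.Tensor:
        # Placeholder basis: Gaussian bumps on a fixed grid; a real KAN layer
        # would evaluate learnable B-splines here.
        grid = torch.linspace(-1.0, 1.0, self.coeffs.shape[-1])
        basis = torch.exp(-((trace.unsqueeze(-1) - grid) ** 2) / 0.1)   # (B, in, K)
        return torch.einsum("bik,oik->bo", basis, self.coeffs)          # (B, out)

    def __call__(self, trace: torch.Tensor) -> torch.Tensor:
        # trace: (batch, in_features) synaptic trace at the current time step.
        if self.last_trace is None:
            self.last_trace, self.cached_out = trace.clone(), self._spline(trace)
            return self.cached_out
        # Recompute only samples whose trace moved by more than delta.
        stale = (trace - self.last_trace).abs().amax(dim=1) > self.delta  # (B,)
        if stale.any():
            self.cached_out[stale] = self._spline(trace[stale])
            self.last_trace[stale] = trace[stale]
        return self.cached_out
```

Because synaptic traces decay slowly between spikes, most per-step trace changes stay below δ (the abstract reports a mean change of 0.02 against δ = 0.05), so in this sketch the expensive spline evaluation is skipped at the majority of time steps.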
