Enhancing Energy Efficiency of Neuromorphic Systems
Abstract
Ensuring energy-efficient operation in neuromorphic computing systems requires architectures that are closely aligned with both hardware constraints and learning algorithms. This work advances brain-inspired perceptual computing by introducing a novel combined learning strategy for Convolutional Spiking Neural Networks (CSNNs). CSNNs offer a compelling alternative to conventional, power-intensive machine learning approaches such as backpropagation by leveraging event-driven spiking neuron dynamics inspired by biological neural systems. The proposed learning framework integrates pair-based Spike-Timing-Dependent Plasticity (PSTDP) with power-law-dependent STDP to modulate synaptic strengths. This combination enables the effective exploitation of stochastic hardware elements, including memristive devices, thereby enhancing energy efficiency while improving perceptual computing performance. By reducing the number of trainable parameters without compromising accuracy, the proposed approach achieves lower energy consumption and reduced area overhead, making it well suited for neuromorphic hardware implementations. This study further explores neuromorphic design architectures centered on CSNNs, presenting a general framework for energy-efficient computing hardware. Multiple CSNN architectures are evaluated to examine the trade-off between model complexity and perceptual accuracy, demonstrating that acceptable performance can be achieved with significantly fewer trainable parameters. The results position CSNNs as strong candidates for energy-efficient neuromorphic architectures, and comparisons with prior work validate the effectiveness and advantages of the proposed methodology.
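The abstract does not give the form of the combined update rule. A minimal sketch of how a pair-based exponential STDP window might be combined with a power-law weight dependence is shown below; the parameter names and values (TAU_PLUS, A_PLUS, MU, the exponential window, and the specific power-law scaling) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Hypothetical parameters (not from the paper): exponential STDP
# window time constants (ms), learning rates, and power-law exponent.
TAU_PLUS, TAU_MINUS = 20.0, 20.0
A_PLUS, A_MINUS = 0.01, 0.012
MU = 2.0

def pair_stdp_window(dt):
    """Pair-based STDP window: dt = t_post - t_pre in ms.
    Pre-before-post (dt >= 0) potentiates; post-before-pre depresses."""
    if dt >= 0:
        return A_PLUS * np.exp(-dt / TAU_PLUS)
    return -A_MINUS * np.exp(dt / TAU_MINUS)

def combined_update(w, dt):
    """Scale the pair-based update by a power-law weight dependence,
    keeping w in [0, 1] (e.g., a normalized memristor conductance)."""
    dw = pair_stdp_window(dt)
    if dw >= 0:
        dw *= (1.0 - w) ** MU   # potentiation saturates near w = 1
    else:
        dw *= w ** MU           # depression saturates near w = 0
    return float(np.clip(w + dw, 0.0, 1.0))

# Example: a weight of 0.5 after a +5 ms pre-before-post pairing.
w_new = combined_update(0.5, 5.0)
```

One plausible motivation for such a power-law weight dependence is that it softly bounds weights within a fixed range, which maps naturally onto the limited conductance ranges and stochastic switching of the memristive devices the abstract mentions.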