Noise and Dynamical Synapses as Optimization Tools for Spiking Neural Networks

Abstract

Standard artificial neural networks (ANNs) lack flexibility when handling corrupted input because of their fixed structure. In this paper, a spiking neural network (SNN) uses biological temporal coding features, namely noise-induced stochastic resonance and dynamical synapses, to improve the model’s performance when its parameters are not optimized for a given input. Using the analog XOR task as a simplified model of a convolutional neural network, this paper demonstrates two key results: (1) the SNN solves this linearly inseparable problem with fewer neurons than an ANN requires, and (2) in leaky SNNs, the addition of noise and dynamical synapses compensates for non-optimal parameters, achieving near-optimal results for weaker inputs.
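To illustrate the mechanisms named in the abstract, the sketch below simulates a single leaky integrate-and-fire neuron driven through a depressing (dynamical) synapse with additive Gaussian membrane noise. It is a minimal illustration, not the authors' implementation: all parameter values, the function name, and the analog-drive form of the synaptic depression are assumptions chosen only to show how moderate noise can recover spiking for a weak, subthreshold input.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters only; the paper's actual values are not reproduced here.
dt = 1e-3        # integration step (s)
T = 1.0          # simulated time (s)
tau_m = 20e-3    # membrane time constant (s)
v_th = 1.0       # firing threshold
tau_rec = 0.2    # synaptic resource recovery time constant (s)
k_dep = 0.005    # depletion rate per unit of presynaptic drive
w = 0.5          # synaptic weight


def lif_with_dynamical_synapse(i_drive, noise_std):
    """Leaky integrate-and-fire neuron fed through a depressing synapse.

    i_drive   : weak (subthreshold) analog presynaptic drive
    noise_std : amplitude of additive Gaussian membrane noise
    Returns the number of output spikes in T seconds.
    """
    v, x, spikes = 0.0, 1.0, 0
    for _ in range(int(T / dt)):
        # dynamical synapse: resources x deplete with the drive and recover
        x += dt * ((1.0 - x) / tau_rec - k_dep * x * i_drive)
        i_syn = w * x * i_drive

        # leaky integration plus Gaussian current noise (stochastic resonance source)
        v += dt * (-v / tau_m + i_syn) + noise_std * np.sqrt(dt) * rng.normal()
        if v >= v_th:          # threshold crossing -> spike and reset
            spikes += 1
            v = 0.0
    return spikes


# A weak drive alone stays subthreshold; moderate noise recovers spiking,
# while the same noise without any input produces almost no spurious spikes.
for drive, sigma in [(90.0, 0.0), (90.0, 1.5), (0.0, 1.5)]:
    n = lif_with_dynamical_synapse(drive, sigma)
    print(f"drive={drive:5.1f}  noise_std={sigma:3.1f}  spikes={n}")
```

With these assumed values the noiseless weak drive produces no output spikes, while moderate noise yields a nonzero firing rate for the same drive and essentially none in its absence, which is the qualitative effect the abstract attributes to noise and dynamical synapses.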
