Synaptic plasticity-based regularizer for artificial neural networks

Abstract

Regularization is an important tool for improving the generalization of ANN models. However, because it imposes no explicit constraints, it cannot guarantee that a model will keep working in a real environment where the data distribution changes continuously. Inspired by neuroplasticity, this paper proposes a bounded regularization method that can be safely activated during the deployment phase. First, we improve the reliability of the outputs of selected neurons by extending our recently proposed neuronal masking. We then regularize the model by introducing a synaptic connection module that determines, based on the incoming input data, how the masks connect to their preceding layer. To find the optimal connection, we formulate a mixed-integer nonlinear programming (MINLP) problem that minimizes the prospect-uncertainty loss, and solve it with our proposed "single wave" method. Finally, we propose a storage/recovery memory module that memorizes these connections together with their corresponding uncertainty levels. Experimental results on classification and regression tasks show that the proposed method outperforms the state of the art in terms of accuracy.
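The abstract describes masking selected neurons and gating their connections to the preceding layer. The sketch below is a hypothetical NumPy illustration of that idea, not the authors' implementation: `mask` marks the selected neurons, and a binary `connection` matrix restricts which previous-layer units feed each masked neuron, while unmasked neurons keep the full weight matrix. The function name, shapes, and the ReLU activation are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def masked_layer_forward(x, W, b, mask, connection):
    """Dense-layer forward pass with neuronal masking and a synaptic
    connection matrix (hypothetical simplification of the paper's idea).

    x          : (d_in,)  input vector
    W, b       : (d_out, d_in) weights and (d_out,) bias
    mask       : (d_out,) binary vector selecting the masked neurons
    connection : (d_out, d_in) binary matrix gating which previous-layer
                 units remain connected to each masked neuron
    """
    # Masked neurons see only their gated connections; unmasked neurons
    # use the full weight matrix unchanged.
    gated_W = np.where(mask[:, None] == 1, W * connection, W)
    return np.maximum(gated_W @ x + b, 0.0)  # ReLU activation

# Toy example: 4 inputs -> 3 neurons; neuron 0 is masked and keeps
# only its connections to the first two inputs.
x = rng.normal(size=4)
W = rng.normal(size=(3, 4))
b = np.zeros(3)
mask = np.array([1, 0, 0])
connection = np.zeros((3, 4))
connection[0, :2] = 1

y = masked_layer_forward(x, W, b, mask, connection)
```

In the paper, the connection matrix is not fixed by hand as here but is chosen per input by solving the MINLP problem, and the chosen connections are cached by the memory module along with their uncertainty level.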
