Propagation of Error in Neural Networks: Introducing the Skeptic Function

Abstract

This paper introduces a novel modification to standard backpropagation by incorporating a Skeptic Function, which dynamically adjusts error propagation based on the distance between activations and class centroids. By computing these distances at each layer, we introduce a mechanism that modulates gradient updates, reducing the influence of ambiguous or overlapping representations. This approach aims to improve class separability while mitigating gradient amplification in deep networks. We provide a mathematical formulation of the Skeptic Function, analyze its computational overhead compared to traditional backpropagation, and present empirical results demonstrating its impact on training dynamics and representation learning.
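The abstract describes modulating gradient updates by the distance between a sample's activations and its class centroid. The paper's exact formulation is not given here, so the following is only a minimal sketch of the idea, assuming a per-sample weight that decays exponentially with distance to the sample's own class centroid (the function name `skeptic_weights` and the temperature parameter `tau` are illustrative, not taken from the paper):

```python
import numpy as np

def skeptic_weights(activations, labels, num_classes, tau=1.0):
    """Hypothetical sketch of a Skeptic-Function-style modulation:
    down-weight gradient contributions for samples whose activations
    lie far from their class centroid (i.e. ambiguous representations).

    activations: (N, D) array of layer activations
    labels:      (N,) array of integer class labels
    Returns a (N,) array of weights in (0, 1] used to scale
    per-sample gradients during backpropagation.
    """
    # Per-class centroids of the current layer's activations.
    centroids = np.stack([
        activations[labels == c].mean(axis=0) for c in range(num_classes)
    ])
    # Euclidean distance of each sample to its own class centroid.
    d = np.linalg.norm(activations - centroids[labels], axis=1)
    # Close to centroid -> weight near 1; far -> weight near 0.
    return np.exp(-d / tau)
```

Under this reading, the weight would multiply each sample's error signal before it is propagated to earlier layers, shrinking updates driven by overlapping representations.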
