Propagation of Error in Neural Networks: Introducing the Skeptic Function
Abstract
This paper introduces a novel modification to standard backpropagation by incorporating a Skeptic Function, which dynamically adjusts error propagation based on the distance between activations and class centroids. By computing these distances at each layer, we introduce a mechanism that modulates gradient updates, reducing the influence of ambiguous or overlapping representations. This approach aims to improve class separability while mitigating gradient amplification in deep networks. We provide a mathematical formulation of the Skeptic Function, analyze its computational overhead compared to traditional backpropagation, and present empirical results demonstrating its impact on training dynamics and representation learning.
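To make the idea concrete, the following is a minimal sketch of how such centroid-distance-based gradient modulation could look. It is not the paper's actual formulation: the weighting rule (an exponential decay with an assumed temperature `tau`), the function name `skeptic_weights`, and the per-sample application are all illustrative assumptions.

```python
import numpy as np

def skeptic_weights(activations, labels, num_classes, tau=1.0):
    """Hypothetical sketch: weight per-sample gradient contributions
    by each sample's distance to its class centroid.

    Samples whose layer activations lie far from their class centroid
    (ambiguous or overlapping representations) receive a smaller weight,
    damping their influence on the update. Assumes every class index
    in range(num_classes) appears at least once in `labels`.
    """
    # Per-class centroids of the layer activations
    centroids = np.stack([
        activations[labels == c].mean(axis=0) for c in range(num_classes)
    ])
    # Euclidean distance of each sample to its own class centroid
    dists = np.linalg.norm(activations - centroids[labels], axis=1)
    # Monotonically decreasing weight in (0, 1]; tau is an assumed temperature
    return np.exp(-dists / tau)

# Usage: scale per-sample gradients by these weights before averaging
rng = np.random.default_rng(0)
acts = rng.normal(size=(8, 4))      # activations of one layer, batch of 8
labels = np.array([0, 0, 1, 1, 0, 1, 0, 1])
w = skeptic_weights(acts, labels, num_classes=2)
```

In a full training loop, these weights would multiply each sample's contribution to the layer's gradient, so that ambiguous samples near class boundaries pull less on the parameters; the extra cost per layer is one centroid computation and one distance computation over the batch.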