Experiential neural architecture selection: dynamic cross-layer memory for real-time inference optimization


Abstract

Neural networks suffer from operational amnesia: they process each input as if for the first time, without remembering which neuron combinations proved effective in similar contexts. We introduce ExNAS (Experiential Neural Architecture Selection), a system that performs real-time, neuron-granular architectural adaptation within a single inference pass by leveraging a distributed experiential memory. ExNAS records layer-wise neural fingerprints with lightweight contextual metadata, then performs transversal selection across non-consecutive layers under explicit per-layer and global budgets. On a CPU proof-of-concept using a small CNN (2×Conv+FC), ExNAS delivers measurable time reductions (≈3.7–7.9%) and throughput gains (≈3.8–8.5%) at low active fractions (≈4.7–10.9%), without retraining. We detail the design, provide formal definitions, and discuss sensitivity to budgets, including a negative case where heavier adaptation adds overhead. These results substantiate experience-guided, neuron-level conditional computation as a practical tool for real-time inference.
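The mechanism the abstract describes can be pictured as a lookup of previously effective neuron sets keyed by a layer-wise fingerprint, with a budget cap on how many neurons stay active. The sketch below is illustrative only: the names (`ExperientialMemory`, `fingerprint`, `select_active`) and the fallback policy are assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of experience-guided neuron selection under a
# per-layer budget. All names and policies here are illustrative.

def fingerprint(activations, top_k=3):
    """Layer fingerprint: indices of the top-k most active neurons."""
    order = sorted(range(len(activations)), key=lambda i: -activations[i])
    return tuple(sorted(order[:top_k]))

class ExperientialMemory:
    """Maps (layer, fingerprint) -> neuron sets that proved effective before."""
    def __init__(self):
        self.store = {}

    def record(self, layer, fp, effective_neurons):
        self.store[(layer, fp)] = frozenset(effective_neurons)

    def recall(self, layer, fp):
        return self.store.get((layer, fp))

def select_active(memory, layer, activations, per_layer_budget, fallback_frac=0.5):
    """Return active neuron indices for this layer, capped by the budget."""
    fp = fingerprint(activations)
    remembered = memory.recall(layer, fp)
    if remembered is None:
        # Cold path: no experience yet, keep the strongest fraction of neurons.
        k = max(1, int(len(activations) * fallback_frac))
        order = sorted(range(len(activations)), key=lambda i: -activations[i])
        return set(order[:k])
    # Warm path: reuse remembered neurons, truncated to the per-layer budget.
    return set(sorted(remembered)[:per_layer_budget])
```

In this toy form the global budget would be enforced by a second pass over all layers' selections; the real system additionally performs transversal selection across non-consecutive layers.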
