Homeostatic Binary Networks: A simple framework for learning with overlapping patterns

Abstract

Memories are rarely stored in isolation: experiences overlap in time and context, leading to neuronal activity patterns that share elements across episodes. While such overlap supports generalization and abstraction, it also increases interference and threatens representational stability. Here we introduce Homeostatic Binary Networks (HBNs), a minimal recurrent framework that combines binary activity, adjustable inhibition, Hebbian learning, and homeostatic plasticity to address these challenges. First, we formalize an Episode Generation Protocol (EGP) that creates compositional episodes with controllable overlap and noise, and define a corresponding semantic structure as conditional probabilities between concepts. We then show analytically and through simulations that recurrent synapses converge to conditional firing probabilities, thereby encoding asymmetric semantic relationships across concepts. These recurrent dynamics enable reliable recall and replay of overlapping episodes without representational collapse. Finally, by incorporating feed-forward plasticity with a neuronal maturity mechanism, output neurons form selective receptive fields in a one-shot manner and refine them through replay, yielding robust unsupervised classification of overlapping episodes. Together, our results demonstrate how simple principles such as neural and synaptic competition can support the stable representation and organization of overlapping memories, providing a mechanistic bridge between episodic and semantic structure in memory systems.
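The abstract does not give the paper's exact learning rule, but its central claim, that recurrent synapses converge to conditional firing probabilities and thereby encode asymmetric relationships between overlapping concepts, can be illustrated with a standard presynaptically gated Hebbian rule with weight decay (an assumption here, not necessarily the rule used in the paper). In this sketch, two hypothetical binary concept patterns share a pair of neurons, and each weight W[i, j] converges toward P(neuron j active | neuron i active):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical "concepts" over six binary neurons; neurons 2 and 3
# are shared, so the two activity patterns overlap.
A = np.array([1, 1, 1, 1, 0, 0], dtype=float)
B = np.array([0, 0, 1, 1, 1, 1], dtype=float)

n = len(A)
W = np.zeros((n, n))  # W[i, j]: recurrent weight from presynaptic i to postsynaptic j
eta = 0.02            # illustrative learning rate

for _ in range(5000):
    # Each episode presents concept A with probability 0.7, B with 0.3.
    x = A if rng.random() < 0.7 else B
    # Presynaptically gated Hebbian rule with decay: whenever neuron i
    # fires, W[i, j] moves toward the current activity of neuron j, so it
    # tracks the conditional firing probability P(j active | i active).
    W += eta * x[:, None] * (x[None, :] - W)

print(W[0, 2], W[2, 0])
```

Because neuron 0 (unique to concept A) always co-fires with shared neuron 2, W[0, 2] approaches 1, while W[2, 0] approaches only the episode frequency of A (about 0.7): the weight matrix is asymmetric even though the Hebbian update itself is symmetric in form, mirroring the asymmetric semantic relationships described in the abstract.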
