Homeostatic Binary Networks: A simple framework for learning with overlapping patterns
Albesa-Gonzalez, A.; Clopath, C.
Abstract

Memories are rarely stored in isolation: experiences overlap in time and context, leading to neuronal activity patterns that share elements across episodes. While such overlap supports generalization and abstraction, it also increases interference and threatens representational stability. Here we introduce Homeostatic Binary Networks (HBNs), a minimal recurrent framework that combines binary activity, adjustable inhibition, Hebbian learning, and homeostatic plasticity to address these challenges. First, we formalize an Episode Generation Protocol (EGP) that creates compositional episodes with controllable overlap and noise, and define a corresponding semantic structure as conditional probabilities between concepts. We then show analytically and through simulations that recurrent synapses converge to conditional firing probabilities, thereby encoding asymmetric semantic relationships between concepts. These recurrent dynamics enable reliable recall and replay of overlapping episodes without representational collapse. Finally, by incorporating feed-forward plasticity with a neuronal maturity mechanism, output neurons form selective receptive fields in one shot and refine them through replay, yielding robust unsupervised classification of overlapping episodes. Together, our results demonstrate how simple principles such as neural and synaptic competition can support the stable representation and organization of overlapping memories, providing a mechanistic bridge between episodic and semantic structure in memory systems.
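To make the abstract's central claim concrete, the sketch below (in Python; not the paper's code) pairs a simplified stand-in for the Episode Generation Protocol, namely binary concept assemblies with controllable overlap composed into noisy episodes, with a presynaptically gated Hebbian rule that includes a homeostatic decay, W[i, j] <- W[i, j] + eta * x_i * (x_j - W[i, j]). Under this assumed rule, each weight's fixed point is the conditional firing probability P(x_j = 1 | x_i = 1), which matches the abstract's statement that recurrent synapses converge to conditional firing probabilities. All parameter names and values are illustrative assumptions.

```python
# Minimal sketch (assumptions, not the paper's implementation): overlapping binary
# concept assemblies, noisy compositional episodes, and a presynaptically gated
# Hebbian rule with homeostatic decay whose fixed point is P(x_j = 1 | x_i = 1).
import numpy as np

rng = np.random.default_rng(0)

n_neurons = 60     # network size (illustrative)
n_concepts = 3     # number of concept assemblies
size = 20          # active units per concept
overlap = 5        # units shared between consecutive concepts
noise = 0.02       # per-unit flip probability when an episode is generated
eta = 0.05         # learning rate
n_episodes = 5000  # number of presented episodes

# Build overlapping concept assemblies (index sets of active units).
concepts = [np.arange(c * (size - overlap), c * (size - overlap) + size)
            for c in range(n_concepts)]

def generate_episode():
    """Compose an episode from a random subset of concepts, then add flip noise."""
    x = np.zeros(n_neurons)
    present = rng.random(n_concepts) < 0.5          # each concept appears with prob 0.5
    if not present.any():
        present[rng.integers(n_concepts)] = True    # ensure at least one concept
    for c in np.flatnonzero(present):
        x[concepts[c]] = 1.0
    flip = rng.random(n_neurons) < noise
    x[flip] = 1.0 - x[flip]                         # binary noise on individual units
    return x

# Hebbian potentiation (eta * x_i * x_j) balanced by a presynaptically gated
# homeostatic decay (eta * x_i * W), i.e. W[i, j] += eta * x[i] * (x[j] - W[i, j]).
W = np.zeros((n_neurons, n_neurons))
for _ in range(n_episodes):
    x = generate_episode()
    W += eta * x[:, None] * (x[None, :] - W)

# W[i, j] estimates P(x_j = 1 | x_i = 1), which is asymmetric for overlapping concepts.
i = concepts[0][-1]         # a unit shared by concepts 0 and 1
j = concepts[1][size // 2]  # a unit belonging only to concept 1
print(f"W[i, j] ~ P(j | i) = {W[i, j]:.2f}")
print(f"W[j, i] ~ P(i | j) = {W[j, i]:.2f}")
```

With this generator, W[i, j] should settle well below W[j, i], since unit i fires whenever either concept 0 or concept 1 is present, whereas unit j fires only with concept 1. This gives a concrete instance of the asymmetric semantic relationships that the recurrent weights are described as encoding.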