Condor: A Neural Connection Network for Enhanced Attention
Abstract
The attention mechanism in traditional neural networks relies on pairwise interactions between tokens, limiting its ability to capture complex, multi-token relationships. This study introduces Condor, a novel architecture that extends the attention mechanism through a neural connection network based on the KY Transform theory. Our approach replaces static attention patterns with learnable connection functions that dynamically model relationships within a local window. The Condor architecture achieves linear computational complexity of O(LWH) while retaining the expressive power to capture sophisticated sequence patterns. Experimental results on WikiText-2 show lower perplexity and faster convergence than a standard Transformer baseline, and analysis of the learned connections indicates that each attention head learns a distinct connection pattern specialized for a different aspect of sequence modeling. Code is available at: https://github.com/Kim-Ai-gpu/Condor
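To make the idea concrete, the sketch below shows one possible reading of a local-window "connection" layer in PyTorch: instead of dot-product scores, a small learnable MLP scores each (query, neighbor) pair within a causal window of W preceding tokens, giving O(L·W·H) cost. The class and parameter names (LocalConnectionAttention, window_size) and the shared-across-heads MLP are illustrative assumptions, not the authors' actual API; see the Condor repository for the real implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LocalConnectionAttention(nn.Module):
    """Hypothetical sketch of a local-window connection head (not the official Condor code)."""
    def __init__(self, d_model: int, n_heads: int, window_size: int):
        super().__init__()
        assert d_model % n_heads == 0
        self.h, self.w = n_heads, window_size
        self.d_head = d_model // n_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)
        # Learnable "connection function": a small MLP over concatenated
        # (query, neighbor) features. Shared across heads here for brevity;
        # per-head MLPs would let each head learn its own connection pattern.
        self.connect = nn.Sequential(
            nn.Linear(2 * self.d_head, self.d_head),
            nn.GELU(),
            nn.Linear(self.d_head, 1),
        )
        self.out = nn.Linear(d_model, d_model)

    def forward(self, x):                                  # x: (B, L, d_model)
        B, L, _ = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        split = lambda t: t.view(B, L, self.h, self.d_head).transpose(1, 2)
        q, k, v = split(q), split(k), split(v)             # (B, H, L, d_head)

        # Gather the W previous positions for every query (causal window).
        # Padded slots are zeros and, for simplicity, are not masked out.
        pad_k = F.pad(k, (0, 0, self.w - 1, 0))            # (B, H, L+W-1, d_head)
        pad_v = F.pad(v, (0, 0, self.w - 1, 0))
        idx = (torch.arange(L, device=x.device).unsqueeze(1)
               + torch.arange(self.w, device=x.device))    # (L, W)
        k_win, v_win = pad_k[:, :, idx], pad_v[:, :, idx]  # (B, H, L, W, d_head)

        # Connection score per (query, window-slot) pair: O(L * W * H) overall.
        pair = torch.cat([q.unsqueeze(3).expand_as(k_win), k_win], dim=-1)
        weights = self.connect(pair).squeeze(-1).softmax(dim=-1)   # (B, H, L, W)
        ctx = (weights.unsqueeze(-1) * v_win).sum(dim=3)           # (B, H, L, d_head)
        return self.out(ctx.transpose(1, 2).reshape(B, L, -1))

# Example usage of the sketch:
# layer = LocalConnectionAttention(d_model=256, n_heads=8, window_size=16)
# y = layer(torch.randn(2, 128, 256))   # -> (2, 128, 256)
```

The linear complexity follows directly from the windowing: each of the L queries scores only W neighbors per head, so cost grows as L·W·H rather than the L²·H of full pairwise attention.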