Element-wise Multiplicative Interactions in Neural Networks: Theory, Advances, and Open Problems
Abstract
The Hadamard product, or element-wise multiplication, has emerged as a fundamental operation in modern deep learning architectures, far beyond its origins in linear algebra. This survey provides a comprehensive overview of the Hadamard product's role across neural network models, with a particular focus on its use in gating mechanisms, attention modules, feature fusion strategies, and meta-learning systems. We begin by introducing the mathematical formulation and basic properties of the Hadamard product, and then explore its integration into deep learning through numerous architectural motifs. Recent advances reveal creative and powerful extensions of the operation, including learnable modulations, bilinear and trilinear interactions, stochastic variants, and structured sparsity mechanisms. Despite its advantages in computational efficiency and model flexibility, the Hadamard product also presents critical challenges, such as expressive limitations, dimensional rigidity, optimization instability, and a lack of theoretical grounding. This survey not only highlights these challenges but also outlines promising future research directions aimed at enhancing the utility, interpretability, and theoretical understanding of element-wise multiplicative interactions in deep learning. Our goal is to establish the Hadamard product as a first-class citizen in the design and analysis of next-generation neural architectures.
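As a minimal illustration of the gating motif the survey discusses, the sketch below applies a sigmoid gate to a feature vector via the Hadamard product. The shapes, weights, and function names are illustrative assumptions, not taken from any specific architecture in the survey.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    # Squashes gate logits into (0, 1) so each feature is scaled, not flipped
    return 1.0 / (1.0 + np.exp(-x))

# Hadamard product: (a ⊙ b)_i = a_i * b_i, applied element-wise
h = rng.standard_normal(8)           # hidden features (illustrative size)
W_g = rng.standard_normal((8, 8))    # learnable gate weights (assumed shape)

gate = sigmoid(W_g @ h)              # per-feature gate values in (0, 1)
gated = gate * h                     # element-wise modulation: gate ⊙ h

# The gate can only attenuate each feature, never amplify or change its sign
assert gated.shape == h.shape
assert np.all(np.abs(gated) <= np.abs(h))
```

Because the gate values lie in (0, 1), this kind of multiplicative interaction acts as a soft, per-dimension feature selector, which is one reason it appears so often in gating and attention modules.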