Tensor Logic of Embedding Vectors in Neural Networks
Abstract
Current Large Language Models (LLMs), built on artificial neural networks, rely primarily on statistical token prediction and often lack rigorous structural semantic consistency and illocutionary force. This paper introduces \textbf{Tensor Functional Language Logic (T-FLL)} as a formal bridge between symbolic reasoning and continuous neural manifolds. We redefine linguistic units as functional noemes and propose a mapping of logical operators onto tensor operations. Sentences are translated into \emph{noematic formulae}, and we show that the attention mechanism driving the semantics of a dialog can be reformulated more efficiently when directed by these formulae. In this way, we outline a path toward more explainable and structurally sound AI architectures.
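As a minimal illustration of what a mapping of logical operators onto tensor operations can look like, the sketch below uses a standard fuzzy-logic embedding (min/max t-norms, max-reduction for the existential quantifier). The function names, the t-norm choice, and the toy predicates are assumptions for illustration, not the paper's T-FLL definitions:

```python
import numpy as np

# Hypothetical sketch: predicate truth values live in [0, 1]-valued tensors
# indexed by entities, so logical operators become tensor operations.

def t_and(a, b):
    # Conjunction as elementwise minimum (Goedel t-norm).
    return np.minimum(a, b)

def t_or(a, b):
    # Disjunction as elementwise maximum.
    return np.maximum(a, b)

def t_not(a):
    # Negation as complement in [0, 1].
    return 1.0 - a

def t_exists(a, axis):
    # Existential quantification as a max-reduction over an entity axis.
    return a.max(axis=axis)

# Toy predicates: Likes[x, y] = degree to which entity x likes entity y,
# Happy[x] = degree to which entity x is happy.
likes = np.array([[0.9, 0.1],
                  [0.2, 0.8]])
happy = np.array([0.7, 0.4])

# Formula "Happy(x) AND (Exists y. Likes(x, y))", evaluated per entity x.
formula = t_and(happy, t_exists(likes, axis=1))
print(formula)
```

Under this reading, a noematic formula compiles to a fixed dataflow graph of such reductions, which is one way an attention mechanism could be directed by the formula's structure rather than by unconstrained token statistics.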