Asymptotic Equivalence of Bit-Based Prime Complexity and Shannon Entropy


Abstract

We propose a novel information-theoretic interpretation of arithmetic structure by introducing a finite-bit formalism grounded in constructibility and information invariance. In this framework, the bit content $\mathfrak{B}(n)$ of an integer $n$ is defined as the total number of prime factors counted with multiplicity, coinciding with the classical prime Omega function $\Omega(n)$. We show that the average bit content of the integers up to $n$ is asymptotically equivalent to the Shannon information of detecting a prime number near $n$. Specifically, we prove that
\[
\mathbb{E}[\Omega(n)] \sim \ln 2 \cdot \log_2 \log n \sim \ln 2 \cdot I(p_n) \qquad (n \to \infty),
\]
where $I(p_n)$ denotes the Shannon information associated with the probability of primality near $n$. This result reveals a structural correspondence between the arithmetic complexity of integers and the informational cost of primality, offering a new unifying perspective on classical number theory, logic, and information. The interpretation also suggests that bit-invariant arithmetic encodes asymptotic entropy, bridging constructible mathematics with probabilistic information theory.
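The correspondence can be checked numerically: by the Hardy–Ramanujan heuristic the average of $\Omega(k)$ for $k \le n$ grows like $\ln \ln n$, while the prime number theorem gives a primality probability of roughly $1/\ln n$ near $n$, so $I(p_n) \approx \log_2 \ln n$ and $\ln 2 \cdot I(p_n) \approx \ln \ln n$. The following sketch (the cutoff `N` and the trial-division factor counter are illustrative choices, not taken from the paper) compares the two quantities:

```python
import math

def big_omega(n: int) -> int:
    """Count the prime factors of n with multiplicity, i.e. Omega(n)."""
    count = 0
    d = 2
    while d * d <= n:
        while n % d == 0:
            n //= d
            count += 1
        d += 1
    if n > 1:  # remaining cofactor is prime
        count += 1
    return count

N = 10**5  # illustrative cutoff; larger N slowly tightens the agreement

# Empirical average bit content E[Omega(k)] over 2 <= k <= N.
avg_omega = sum(big_omega(k) for k in range(2, N + 1)) / (N - 1)

# Shannon information of primality near N: p ~ 1/ln N, so I = log2(ln N),
# and ln 2 * I = ln ln N, the Hardy-Ramanujan growth rate.
info_bits = math.log2(math.log(N))

print(f"average Omega up to {N}: {avg_omega:.4f}")
print(f"ln 2 * I(p_N) = ln ln N: {math.log(2) * info_bits:.4f}")
```

The two printed values differ by a bounded constant (the Mertens-type constant in the average of $\Omega$), consistent with asymptotic equivalence of the leading terms rather than equality at finite $N$.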
