Entropy of the Prime Numbers
The Prime Number Theorem is one of the most famous theorems in mathematics. It tells us something about the distribution of the prime numbers.
How many of the first n integers 1, 2, 3, ..., n are prime? The Prime Number Theorem says the number of primes is approximately n/log(n), where the log is base e.
This is not an exact count; n/log(n) is only an approximation, but as n gets bigger the approximation gets better and better.
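You can watch the approximation improve with a short Python sketch. The sieve function here is just an illustrative helper, not part of the theorem itself:

```python
import math

def count_primes(n):
    """Count the primes <= n with a simple Sieve of Eratosthenes."""
    if n < 2:
        return 0
    is_prime = [True] * (n + 1)
    is_prime[0] = is_prime[1] = False
    for p in range(2, math.isqrt(n) + 1):
        if is_prime[p]:
            for multiple in range(p * p, n + 1, p):
                is_prime[multiple] = False
    return sum(is_prime)

for n in (10**3, 10**4, 10**5, 10**6):
    actual = count_primes(n)
    estimate = n / math.log(n)  # natural log, as in the theorem
    print(n, actual, round(estimate), round(actual / estimate, 4))
```

The ratio in the last column drifts toward 1 as n grows, though the convergence is slow.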
Let's use the change of base formula for logarithms to convert this to log base 2. So we get n*log(e)/log(n) where the logs are now base 2.
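As a quick sanity check on the change of base, the two expressions agree numerically (a minimal sketch):

```python
import math

n = 10**6
natural_form = n / math.log(n)                     # n / log_e(n)
base2_form = n * math.log2(math.e) / math.log2(n)  # n * log2(e) / log2(n)
print(natural_form, base2_form)  # equal up to floating-point rounding
```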
The Prime Number Theorem is also a statement about the Shannon Entropy of the primes! Here's how.
Suppose you have a machine with a big red button. Each time you press the button the machine responds by displaying an integer in the range 1, 2, 3, ..., n. After much experimentation you discover that the probability of getting integer j is pj. Information theory then defines the Shannon Entropy (also called Information Entropy) of this machine as:
Shannon Entropy = -sum(pj*log(pj)) for j = 1, 2, 3, ..., n, where the log is base 2.
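The definition is easy to turn into code. A minimal Python sketch (the function name is my own):

```python
import math

def shannon_entropy(probs):
    """H = -sum(pj * log2(pj)), skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # a fair coin: 1 bit
print(shannon_entropy([1/8] * 8))    # 8 equally likely outcomes: 3 bits
print(shannon_entropy([0.9, 0.1]))   # a biased coin: less than 1 bit
```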
In the special case where all numbers occur with equal probability, pj = 1/n for all j, and we get the famous result for the Shannon Entropy of the machine: Shannon Entropy = log(n). (Each of the n terms in the sum is (1/n)*log(n), so the sum is log(n).)
Now imagine this entropy is "distributed" equally across all n numbers, so on average an individual integer carries log(n)/n of entropy.
If the integers 1, 2, 3, ..., n contain m primes, then the Shannon Entropy of the primes is simply m*log(n)/n.
But the Prime Number Theorem says that m is approximately n*log(e)/log(n). So the approximate Shannon Entropy becomes (n*log(e)/log(n)) * (log(n)/n) = log(e),
and as n approaches infinity this approximation becomes exact. So we can say:
"The Shannon Entropy of the Primes is log(e) = 1.442695..."
This statement is equivalent to the prime number theorem. How strange!
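The convergence can be watched numerically. Here is a sketch (again using an illustrative sieve helper) that computes m*log(n)/n with logs base 2 and compares it with log(e) = 1.4427...:

```python
import math

def count_primes(n):
    """Count the primes <= n with a Sieve of Eratosthenes."""
    is_prime = [True] * (n + 1)
    is_prime[0] = is_prime[1] = False
    for p in range(2, math.isqrt(n) + 1):
        if is_prime[p]:
            for multiple in range(p * p, n + 1, p):
                is_prime[multiple] = False
    return sum(is_prime)

target = math.log2(math.e)  # 1.442695...
print("log2(e) =", target)
for n in (10**3, 10**4, 10**5, 10**6):
    m = count_primes(n)
    entropy = m * math.log2(n) / n  # m * log(n) / n, logs base 2
    print(n, round(entropy, 4))
```

The printed values decrease toward 1.4427, slowly, just as the theorem's slow convergence would suggest.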
Content written and posted by Ken Abbott email@example.com