Prime Numbers and Information Entropy

Information entropy (also called Shannon entropy) is a concept from physics and information theory, so it's a surprise to see it connected to prime numbers!

Suppose you have a machine with a big red button. Each time you press the button the machine responds by displaying an integer in the range 1,2,3,...,n. After much experimentation you discover that the probability of getting integer j is p_j. Information theory then defines a quantity called the information entropy of this machine as:

information entropy = (-1) * sum p_j*log(p_j) where the sum runs over j=1,2,3,...,n
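The formula above is easy to compute directly. Here is a minimal sketch in Python (the function name and the example probabilities are my own, chosen for illustration; the natural logarithm is used throughout, matching the convention in this post):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy: (-1) * sum of p_j*log(p_j), using the natural log.

    Terms with p_j = 0 are skipped, since p*log(p) -> 0 as p -> 0.
    """
    return -sum(p * math.log(p) for p in probs if p > 0)

# A biased three-outcome machine, as an illustration:
print(shannon_entropy([0.5, 0.25, 0.25]))  # roughly 1.04 nats
```

Because the natural log is used, the answer comes out in "nats" rather than the bits you would get from log base 2.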

In the special case where all numbers occur with equal probability we have p_j = 1/n for all j, and we get the famous result for the information entropy of the machine:

information entropy = (-1) * n * (1/n)*log(1/n) = log(n)
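This special case is easy to verify numerically. A quick sketch (again using a helper of my own naming):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy with the natural log."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# n equally likely outcomes, each with probability 1/n:
n = 100
uniform = [1 / n] * n
print(shannon_entropy(uniform))  # matches log(n)
print(math.log(n))
```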

In this uniform case every term of the sum is the same, so each individual integer contributes exactly log(n)/n to the entropy.

If the integers 1,2,3,...,n contain m prime numbers, then the primes together contribute m*log(n)/n to the entropy.

But a famous result in mathematics called the Prime Number Theorem says that m is approximately n/log(n) (with log the natural logarithm), so the approximate entropy contribution of the prime numbers is:

information entropy = m*log(n)/n = (n/log(n))*log(n)/n = 1

As n approaches infinity the Prime Number Theorem approximation becomes exact (the ratio of m to n/log(n) tends to 1), and so the entropy contribution tends to exactly 1. So we can say that..
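The claim can be checked numerically by counting primes with a sieve and computing m*log(n)/n (the helper name is mine; note that the convergence is quite slow, as is typical for Prime Number Theorem estimates):

```python
import math

def count_primes(n):
    """Count primes <= n with a simple sieve of Eratosthenes."""
    sieve = [True] * (n + 1)
    sieve[0] = sieve[1] = False
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            for q in range(p * p, n + 1, p):
                sieve[q] = False
    return sum(sieve)

# m*log(n)/n drifts slowly toward 1 as n grows:
for n in (10**3, 10**5, 10**7):
    m = count_primes(n)
    print(n, m * math.log(n) / n)
```

For modest n the ratio sits a little above 1 and decreases slowly, consistent with the limit being exactly 1.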

"The Shannon Entropy of the Prime Numbers is 1".

Or to put it another way..

"Prime numbers are distributed in such a way as to ensure their Shannon Entropy is 1."

Content written and posted by Ken Abbott