Shannon Entropy - Explained in 90 Seconds
Shannon Entropy (also called Information Entropy) is a concept used in physics and information theory. Here's the scoop..
Suppose you have a system with n states, i.e. whenever you observe the system you find it in exactly one of n possible states.
Now make a large number of observations of the system and use them to estimate the probability pi that an observation finds the system in state i. So every state i of the system has a probability pi.
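Estimating those probabilities is just counting. A quick sketch in Python (the state labels here are made-up examples, not from the post):

```python
from collections import Counter

# Hypothetical record of observations of the system's state
observations = ["A", "B", "A", "A", "C", "B", "A", "A"]

n = len(observations)
counts = Counter(observations)

# Estimate pi as the fraction of observations that found state i
probs = {state: count / n for state, count in counts.items()}
```

The estimates get better the more observations you make.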
Now construct this crazy sum: sum = p1*log(p1) + p2*log(p2) + ... + pn*log(pn), where the sum runs over all n states of the system.
If the log is base 2 then (-1)*sum is called the "information entropy" of the system, and it's measured in bits.
Note that "information entropy" applies to a complete system, not individual states of a system.
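The crazy sum takes one line of Python. A minimal sketch (the function name is my own, not from the post):

```python
import math

def information_entropy(probs):
    """Shannon entropy, in bits, of a complete probability distribution."""
    # States with p = 0 contribute nothing (the limit of p*log(p) as p -> 0 is 0),
    # so skip them to avoid math.log2(0)
    return -sum(p * math.log2(p) for p in probs if p > 0)
```

A fair coin (p = 0.5 each way) gives exactly 1 bit; a certain outcome (p = 1) gives 0 bits.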
Here's a simple example..
My system is a penny and a table.
I define the system to have 2 states.. the penny lying stationary on the table with heads up, or with tails up.
My experiment is to throw the penny and then observe which state results.
I throw the penny many times and make notes. It lands heads up 1% of the time and tails up 99% of the time (it's biased).
The crazy sum is 0.01*log(0.01) + 0.99*log(0.99) = 0.01*(-6.643856) + 0.99*(-0.0145) = -0.06643856 + (-0.01435500) = -0.08079356
So the information entropy of the system is (-1)*(-0.08079356) = 0.08079356 bits. It's close to zero because the outcome is nearly certain.. a fair penny would give 1 bit.
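You can check the arithmetic above in a couple of lines of Python:

```python
import math

# The biased penny from the example above
p_heads, p_tails = 0.01, 0.99

crazy_sum = p_heads * math.log2(p_heads) + p_tails * math.log2(p_tails)
entropy = -crazy_sum

# entropy comes out to about 0.0808 bits, matching the hand calculation
print(entropy)
```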
Content written and posted by Ken Abbott email@example.com