How to compare the entropy of two systems?


Suppose I have two systems $A$ and $B$ that produce numbered tiles. System $A$ produces tiles 1, 2, 3, 4, and 5 with the probabilities:

P(1) = 0.5
P(2) = 0.2
P(3) = 0.2
P(4) = 0.07
P(5) = 0.03

System $B$ produces tiles 10, 11, 12, ... 100 with the probabilities:

P(10) = 0.08
P(11) = 0.05
P(12) = 0.09
P(13) = 0.05
P(14) = 0.05
P(15) = 0.01
P(N) = something similarly small

Can I directly compare the entropy of these two systems using $$H(X) = -\sum_{i=1}^n P(x_i)\log_2 P(x_i)?$$

Is b-ary entropy relevant here? Or is that something totally different?
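For concreteness, here is a quick Python sketch of the formula above applied to system $A$'s distribution (the helper name `entropy_bits` is mine):

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits: H = -sum p * log2(p), skipping zero terms."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# System A's distribution from the question
p_a = [0.5, 0.2, 0.2, 0.07, 0.03]
print(entropy_bits(p_a))  # roughly 1.85 bits
```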

There are 2 answers below.


Entropy increases with the number of states, so you cannot use that formula directly to compare these two systems. A simple option (though not the best) is to normalize both entropies and then compare. The normalization divides the formula by $\log N$, which is the maximum entropy of a system with $N$ states, attained by the uniform distribution $P(x_i) = 1/N$:

$H_{max} = -\sum_{i=1}^N \frac{1}{N} \log \frac{1}{N} $

$H_{max} = -\log \frac{1}{N} = \log N $

So the normalized entropy is:

$H_{N} = \frac{-\sum_{i=1}^N P(x_i) \log P(x_i)}{\log N} $

So for system $A$, the maximum entropy is attained by the discrete uniform distribution (i.e., $P(x_i) = 1/5 ~~ \forall i$), giving $H_{max} = \log 5$.

And you can compare between the two normalized entropies.
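A minimal Python sketch of this normalization (the function name `normalized_entropy` is my own):

```python
import math

def normalized_entropy(probs):
    """Entropy divided by log2(N): maps an N-state distribution into [0, 1]."""
    n = len(probs)
    h = -sum(p * math.log2(p) for p in probs if p > 0)
    return h / math.log2(n)

# System A from the question: 5 states
print(normalized_entropy([0.5, 0.2, 0.2, 0.07, 0.03]))  # about 0.80
# A uniform distribution always normalizes to 1 (up to floating point)
print(normalized_entropy([0.2] * 5))
```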

This is a fair approach, but depending on what you are doing, you may be able to improve on it with other measures.


Yes, it's perfectly fine to compare the entropies directly, as long as you use the same logarithm base for both. (As for $b$-ary entropy: that is just entropy computed with a base-$b$ logarithm, so with $b = 2$ it is exactly the formula you wrote, measured in bits.)

To start getting the idea, consider a source $A$ with, say, 4 equiprobable outputs and a second source $B$ with, say, 64 equiprobable outputs. Then the entropies are $H_A = 2$ bits and $H_B = 6$ bits, respectively. This makes sense, as that is the average number of bits you need to represent an output in each case.
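The arithmetic in this example is easy to check numerically; a short Python sketch (uniform distributions built inline):

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits for a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# 4 equiprobable outputs -> log2(4) = 2 bits
print(entropy_bits([1/4] * 4))    # 2.0
# 64 equiprobable outputs -> log2(64) = 6 bits
print(entropy_bits([1/64] * 64))  # 6.0
```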