Is max entropy invariant under choice of base for entropy?


Suppose I have a collection of probability mass functions $S_1,\dots,S_n$ and I find that $S_j$ has the greatest entropy according to the usual formula $H(S) = -\sum_{i\in S} p_i \log_2(p_i)$.

In this case we're using a base of 2. Is it guaranteed that if instead I used any other base greater than 1 that I would find that $S_j$ has the greatest entropy?

It seems like this would be a necessary property in information theory; otherwise it would be hard to argue for measuring information in bits (base 2) rather than, say, trits (base 3) or any other unit.

Best answer:

Note that $\log_a x=\frac{\ln x}{\ln a}$, so changing the log base just rescales the entropy by the constant factor $\frac{1}{\ln a}$. For any base $a>1$ this factor is positive, so the ordering of the entropies is preserved: $S_j$ remains the maximizer in every base. (In particular, $H_a(S)=\frac{H_2(S)}{\log_2 a}$.)
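A quick numerical check of this, as a sketch (the example distributions are assumptions, not from the question): computing the entropy of a few distributions in several bases and verifying the argmax is the same each time.

```python
import math

def entropy(p, base):
    """Shannon entropy of a probability mass function p in the given log base."""
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

# Hypothetical example distributions.
dists = [
    [0.5, 0.5],                 # 1 bit
    [0.9, 0.1],                 # < 1 bit
    [0.25, 0.25, 0.25, 0.25],   # 2 bits, the maximizer
]

# The index achieving the maximum entropy is independent of the base.
for base in (2, 3, 10, math.e):
    j = max(range(len(dists)), key=lambda k: entropy(dists[k], base))
    print(f"base {base}: argmax index = {j}")
```

The entropies themselves differ across bases (they are rescaled by $1/\ln a$), but the ranking, and hence the maximizing distribution, does not change.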