Suppose I have two systems $A$ and $B$ that produce numbered tiles. System $A$ produces tiles 1, 2, 3, 4, and 5 with the probabilities:
P(1) = 0.5
P(2) = 0.2
P(3) = 0.2
P(4) = 0.07
P(5) = 0.03
System $B$ produces tiles 10, 11, 12, ... 100 with the probabilities:
P(10) = 0.08
P(11) = 0.05
P(12) = 0.09
P(13) = 0.05
P(14) = 0.05
P(15) = 0.01
P(N) = something similarly small
Can I directly compare the entropy of these two systems using $$H(X) = -\sum_{i=1}^n P(x_i)\log_2(P(x_i))?$$
Is b-ary entropy relevant here? Or is that something totally different?
The issue is that entropy increases with the number of states, so you cannot use that formula directly to compare the two systems. A simple approach (though not the best) is to normalize both entropies and then compare. The normalization divides the formula by $\log N$, the maximum entropy of a system with $N$ states:
$$H_{max} = -\sum_{i=1}^N \frac{1}{N} \log\left(\frac{1}{N}\right) = -\log\left(\frac{1}{N}\right) = \log N$$
So the normalized entropy is:
$H_{N} = \frac{-\sum_{i=1}^N P(x_i) \log (P(x_i))}{\log N} $
So for system $A$, the maximum entropy is attained by the discrete uniform distribution (i.e. $P(x_i) = 1/5 ~~ \forall i$), giving $H_{max} = \log 5$.
And you can compare between the two normalized entropies.
This is a fair approach, but depending on what you are doing, you can improve on it with other measures.
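As a minimal sketch, here is how the normalized entropy could be computed in Python. Since the question does not fully specify system $B$'s probabilities, only system $A$ is evaluated; the helper names are my own and the code assumes each probability vector sums to 1:

```python
import math

def entropy(probs, base=2):
    """Shannon entropy H(X) = -sum p * log(p), skipping zero-probability states."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

def normalized_entropy(probs, base=2):
    """Entropy divided by log(N), the maximum for N states, so the result is in [0, 1]."""
    n = len(probs)
    return entropy(probs, base) / math.log(n, base)

# System A from the question
p_a = [0.5, 0.2, 0.2, 0.07, 0.03]
print(entropy(p_a))             # raw entropy, ~1.85 bits
print(normalized_entropy(p_a))  # ~0.80, comparable across different state counts
```

System $B$'s normalized entropy would be computed the same way once all of its probabilities are known, and the two $H_N$ values can then be compared on the common $[0, 1]$ scale.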