Lower bound for joint entropy


Let
$$H(X_{1}) \leqslant H(X_{2}) \leqslant \cdots \leqslant H(X_{n}).$$

I am having trouble finding a lower bound for the joint entropy. I proved the upper bound using the chain rule and the fact that mutual information is nonnegative (sketched below), obtaining

$$H(X_{1}, X_{2}, \ldots, X_{n}) \leqslant \sum_{k=1}^{n} H(X_{k}).$$
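In outline, that chain-rule argument is

$$H(X_{1}, X_{2}, \ldots, X_{n}) = \sum_{k=1}^{n} H(X_{k} \mid X_{1}, \ldots, X_{k-1}) \leqslant \sum_{k=1}^{n} H(X_{k}),$$

using $I(X_{k}; X_{1}, \ldots, X_{k-1}) = H(X_{k}) - H(X_{k} \mid X_{1}, \ldots, X_{k-1}) \geqslant 0$.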

Where should I start? I have found that the lower bound should be $\max_{k} H(X_{k})$, but how do I prove it?

BEST ANSWER

The property $H(X_1,X_2) \ge H(X_1)$ should be obvious from the interpretation of entropy as information content: the information provided by $X_1$ and $X_2$ together cannot be less than that provided by $X_1$ alone. It can also be proved via the chain rule:

$$H(X_1,X_2) = H(X_1)+H(X_2 \mid X_1) \ge H(X_1),$$ because $H(X_2 \mid X_1) \ge 0$ (as is any entropy of discrete variables).

We can also write $H(X_1,X_2) \ge H(X_2)$. Hence $H(X_1,X_2) \ge \max(H(X_1),H(X_2))$.

You only need to generalize this to $n$ variables.
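Concretely, a sketch of the $n$-variable step (using the fact that joint entropy does not depend on the order of its arguments): for each fixed $k$,

$$H(X_1, \ldots, X_n) = H(X_k) + H(X_1, \ldots, X_{k-1}, X_{k+1}, \ldots, X_n \mid X_k) \geqslant H(X_k),$$

since conditional entropy is nonnegative. As this holds for every $k$,

$$H(X_1, \ldots, X_n) \geqslant \max_{1 \leqslant k \leqslant n} H(X_k) = H(X_n),$$

where the last equality uses the assumed ordering $H(X_1) \leqslant \cdots \leqslant H(X_n)$.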
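Not part of the proof, but both bounds are easy to sanity-check numerically. A minimal sketch using NumPy, with an arbitrary $2 \times 2$ joint pmf for $(X, Y)$:

```python
import numpy as np

# Arbitrary example joint pmf for a pair (X, Y).
p_xy = np.array([[0.2, 0.1],
                 [0.3, 0.4]])

def entropy(p):
    """Shannon entropy in bits of a pmf; zero-probability cells are skipped."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

H_xy = entropy(p_xy)              # joint entropy H(X, Y)
H_x = entropy(p_xy.sum(axis=1))   # marginal entropy H(X)
H_y = entropy(p_xy.sum(axis=0))   # marginal entropy H(Y)

# Lower bound from this answer, upper bound from the question.
assert max(H_x, H_y) <= H_xy + 1e-12
assert H_xy <= H_x + H_y + 1e-12
print(f"H(X)={H_x:.4f}, H(Y)={H_y:.4f}, H(X,Y)={H_xy:.4f}")
```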