Relation between the entropy of one random variable and another


Prove that $H(X)\leq H(0.9, 0.1, 0.1, 0.1)$ for any random variable over $\{1, 2, 3, 4\}$ with $\Pr [X = 1] \geq 0.9$.

Any suggestions? Thanks in advance!


The vector given is not a probability vector (its entries sum to $1.2$), but here is the general proof idea.

Let $(p_1,p_2,p_3,p_4)$ be a probability vector. Then note that $$H(p_1,p_2,p_3,p_4) = H_2(p_1) + (1-p_1)H\left(\frac{p_2}{1-p_1},\frac{p_3}{1-p_1},\frac{p_4}{1-p_1}\right)$$ where $H_2(x) = -x \log(x) - (1-x)\log(1-x)$ is the binary entropy function. Since the uniform distribution maximizes entropy, we have $$H(p_1,p_2,p_3,p_4) \leq H_2(p_1) + (1-p_1)\log(3),$$ with equality if and only if $p_2=p_3=p_4$.
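As a quick numerical sanity check (not part of the original answer), the decomposition and the uniform bound can be verified in Python; the distribution below is an arbitrary example with $p_1 \ge 0.9$:

```python
import math

def H(ps):
    """Shannon entropy in bits; zero-probability terms contribute nothing."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

# Arbitrary example distribution with p1 >= 0.9.
p = [0.9, 0.05, 0.03, 0.02]
p1 = p[0]
rest = [q / (1 - p1) for q in p[1:]]  # conditional distribution given X != 1

# Chain-rule decomposition: H(p) = H2(p1) + (1 - p1) * H(rest)
lhs = H(p)
rhs = H([p1, 1 - p1]) + (1 - p1) * H(rest)
assert abs(lhs - rhs) < 1e-12

# Uniform bound: H(p) <= H2(p1) + (1 - p1) * log2(3)
bound = H([p1, 1 - p1]) + (1 - p1) * math.log2(3)
assert lhs <= bound + 1e-12
```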

The derivative of the right-hand side is $\log\left(\frac{1-p_1}{p_1}\right) - \log(3)$, which is negative whenever $p_1 > \frac34$. So the bound is decreasing on $[0.9, 1]$, and the maximum over $p_1 \geq 0.9$ is attained at $p_1 = 0.9$.