Solve an inequality involving entropy of random variable X


I'm working through *Elements of Information Theory* and came across a question which is stumping me. The problem states: Let $p(x)$ be a probability mass function. Prove, for all $d\ge0$, that $$\text{Pr}(p(X)\le d)\log\frac1d\le H(X),$$ where $H(X)$ is the entropy of $X$. The presence of the $\log\frac1d$ term makes me believe that this is related to a change of base, as $H_b(X)=\log_b(a)H_a(X)$, but I can't see why the $\text{Pr}()$ term would be related to entropy in any way. Any help on the actual method for solving this is appreciated!

On BEST ANSWER

This is a consequence of Markov's inequality, which says the following: let $Y$ be a nonnegative random variable and $a>0$; then $\Pr(Y \geq a) \leq \mathbb{E}[Y]/a$. In this case, apply it to the nonnegative random variable $Y = -\log p(X)$ (nonnegative since $p(X) \leq 1$). Then Markov's inequality gives, for all $a>0$:

$$P(-\log p(X) \geq a) \leq \frac{\mathbb{E}[-\log p(X)]}{a} = \frac{H(X)}{a}$$
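This step can be checked numerically. Below is a minimal sketch (the pmf and the grid of thresholds $a$ are my own illustrative choices, not from the post) verifying that $\Pr(-\log_2 p(X) \geq a) \leq H(X)/a$:

```python
import math
import random

random.seed(0)

# Illustrative random pmf over 6 outcomes (hypothetical example)
w = [random.random() for _ in range(6)]
p = [x / sum(w) for x in w]

# Entropy in bits: H(X) = E[-log2 p(X)]
H = -sum(q * math.log2(q) for q in p)

for a in [0.5, 1.0, 2.0, 4.0]:
    # Pr(-log2 p(X) >= a): total mass of outcomes with -log2 q >= a
    prob = sum(q for q in p if -math.log2(q) >= a)
    # Markov's inequality: Pr(Y >= a) <= E[Y]/a = H/a
    assert prob <= H / a + 1e-12
```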

Now, for $0 < d < 1$, set $a = \log \frac{1}{d} > 0$, so that we get

$$P\left(-\log p(X) \geq \log \frac{1}{d}\right)\log \frac{1}{d} \leq H(X).$$

Finally, notice that $-\log p(X) \geq \log \frac{1}{d}$ is equivalent to $p(X) \leq d$, which gives the claimed inequality. (For $d \geq 1$ the result is immediate: $\log\frac1d \leq 0$, so the left-hand side is nonpositive while $H(X) \geq 0$.)
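The full inequality can also be checked directly. A minimal sketch (the pmf and the values of $d$ are hypothetical choices for illustration) computing $\Pr(p(X)\le d)\log_2\frac1d$ against $H(X)$:

```python
import math

# Illustrative pmf (hypothetical example, not from the original problem)
p = [0.5, 0.25, 0.125, 0.125]

# Entropy in bits
H = -sum(q * math.log2(q) for q in p)  # = 1.75

for d in [0.05, 0.1, 0.2, 0.3, 0.6]:
    # Pr(p(X) <= d): total mass of outcomes whose probability is <= d
    pr = sum(q for q in p if q <= d)
    lhs = pr * math.log2(1 / d)
    # The inequality Pr(p(X) <= d) * log(1/d) <= H(X)
    assert lhs <= H + 1e-12
```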