Absolute value of difference between entropies (of two distributions)


I have the following inequality for the $L_1$ distance between two distributions $Q$, $Q^n$ on a finite set $B$:

$$\|Q-Q^n\| < \frac{2|B|}{n}\leq \frac{C}{n} \leq \frac12 $$

Here $C \geq 2|B|$, $n \geq |B|$, and the $L_1$ distance is defined as $\|Q-Q^n\| = \sum_{b\in B}|Q(b)-Q^n(b)|$.

Here, $Q^n$ is the empirical distribution on $B$ induced by a sequence $(x_i)$ of length $n$ with $x_i \in B$ for all $i$. I don't see how to show the following using the above inequality:

$$|H(Q)-H(Q^n)|<\frac{2C\log(n)}{n}$$

where $H(Q)$ and $H(Q^n)$ are the entropies of $Q$ and $Q^n$, respectively.

I have tried several approaches, but none of them worked out. Does anyone have an idea, or is there some result (other than the bound $H(Q)\leq \log_2 |B|$) that I could use here?
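For what it's worth, here is a small numerical sanity check of the claimed bound. The concrete numbers ($|B| = 4$, $n = 1000$, a nearly balanced sequence, and the smallest admissible $C = 2|B|$) are my own illustrative choices, not part of the question; the sequence counts are picked so that the premise $\|Q - Q^n\| < 2|B|/n$ actually holds.

```python
import math

def entropy(p):
    """Shannon entropy (natural log) of a distribution given as a list of probabilities."""
    return -sum(q * math.log(q) for q in p if q > 0)

def l1(p, q):
    """L1 distance between two distributions on the same finite set."""
    return sum(abs(a - b) for a, b in zip(p, q))

# Hypothetical example: |B| = 4, n = 1000, Q uniform, and Q^n the empirical
# distribution of a nearly balanced length-n sequence over B, so that the
# premise ||Q - Q^n|| < 2|B|/n is satisfied.
n = 1000
Q = [0.25, 0.25, 0.25, 0.25]
counts = [251, 250, 250, 249]      # letter counts of a length-1000 sequence
Qn = [c / n for c in counts]

C = 2 * 4                          # smallest choice allowed by C >= 2|B|

assert l1(Q, Qn) < 2 * 4 / n       # premise: ||Q - Q^n|| < 2|B|/n

gap = abs(entropy(Q) - entropy(Qn))
bound = 2 * C * math.log(n) / n
print(gap, bound)                  # the entropy gap vs. the claimed bound
assert gap < bound
```

Of course, a single numerical instance is no substitute for a proof, but it at least confirms the bound is plausible in a case where the hypothesis on $\|Q - Q^n\|$ holds.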

Thank you!