Proving Pinsker's inequality


I want to show that for $0 < a, b < 1$, $$2(a-b)^2\leq a \log a + (1-a)\log(1-a) - a\log b-(1-a)\log(1-b).$$ To go from left to right I think I have to use Taylor's theorem, but I don't know how. I have also worked out that the right-hand side equals $$a\log\frac ab+(1-a)\log\frac{1-a}{1-b}.$$


Best answer:

As already mentioned, this is a special case of Pinsker's inequality. The following proof for this special case is taken from Pinsker’s inequality and its applications to lower bounds:

For fixed $a \in (0, 1)$ define $f:(0, 1) \to \Bbb R$ as $$ f(b) = a\log\frac ab+(1-a)\log\frac{1-a}{1-b} - 2(a-b)^2 \, . $$ Then $$ f'(b) = -\frac ab + \frac{1-a}{1-b} + 4(a-b) = (b-a) \left(\frac{1}{b(1-b)} - 4\right) \, . $$ The second factor is $\ge 0$, with equality only at a single point ($b= \frac 12$), so $f$ is strictly decreasing on $(0, a]$ and strictly increasing on $[a, 1)$. It follows that $$ f(b) \ge f(a) = 0 \, , $$ which is the desired inequality. Equality holds if and only if $b=a$.
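As a quick numerical sanity check (not part of the proof), one can evaluate $f$ on a grid of $b$ values for a fixed $a$ and confirm it is nonnegative and vanishes at $b = a$. The function name `f` below mirrors the proof's notation; the choice $a = 0.3$ is arbitrary.

```python
import math

def f(a, b):
    # f(b) = a*log(a/b) + (1-a)*log((1-a)/(1-b)) - 2*(a-b)^2,
    # i.e. KL divergence minus twice the squared difference.
    return (a * math.log(a / b)
            + (1 - a) * math.log((1 - a) / (1 - b))
            - 2 * (a - b) ** 2)

a = 0.3
# f should be >= 0 everywhere on (0, 1) ...
for i in range(1, 100):
    b = i / 100
    assert f(a, b) >= 0, (a, b)

# ... with its minimum value 0 attained at b = a.
print(f(a, a))  # 0.0
```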

Another answer:

You are trying to prove Pinsker's inequality. Since both $a$ and $b$ are between 0 and 1, we can think of $(a,1-a)$ and $(b,1-b)$ as binary probability distributions. Let me call them $P$ and $Q$ respectively, so that $P(0)=a$, $P(1)=1-a$, $Q(0)=b$ and $Q(1)=1-b$.

Now it is easy to see that your left hand side is $2\delta(P,Q)^2$, where $\delta$ is the total variation distance between $P$ and $Q$, while the right hand side is the relative entropy, or Kullback-Leibler divergence, $D(P||Q)$.

You can find the proof of Pinsker's inequality in many places, for example here: Proof of Pinsker's inequality.