Let $p_1(\cdot), p_2(\cdot)$ be two discrete distributions on $\mathbb{Z}$. The total variation distance is defined as $d_{TV}(p_1,p_2)= \frac{1}{2} \displaystyle \sum_{k \in \mathbb{Z}}|p_1(k)-p_2(k)|$, and the Shannon entropy is defined in the usual way, i.e. $$ H(p_1)=\sum_k p_1(k) \log\left(\frac{1}{p_1(k)}\right). $$ The binary entropy function $h(\cdot)$ is defined by $h(x)=x \log(1/x)+(1-x)\log(1/(1-x))$ for all $x \in (0,1)$.
I am trying to prove that $$H\left(\frac{p_1+p_2}{2}\right)-\frac{1}{2}H(p_1)-\frac{1}{2}H(p_2) \leq h\!\left(\frac{d_{TV}(p_1,p_2)}{2}\right).$$ Note that the left-hand side is the Jensen–Shannon divergence of $p_1$ and $p_2$. Can anyone guide me in this direction?
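Not a proof, but here is a quick numerical sanity check of the inequality on random finite-support distributions (all function names below are my own). Entropies use the natural logarithm; the choice of base does not matter, since both sides of the inequality scale by the same factor.

```python
import numpy as np

rng = np.random.default_rng(0)

def entropy(p):
    """Shannon entropy with natural log; 0*log(0) treated as 0."""
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def binary_entropy(x):
    """h(x) = x log(1/x) + (1-x) log(1/(1-x)), natural log."""
    if x <= 0.0 or x >= 1.0:
        return 0.0
    return float(-x * np.log(x) - (1 - x) * np.log(1 - x))

violations = 0
for _ in range(10_000):
    n = int(rng.integers(2, 12))            # random support size
    p1 = rng.random(n); p1 /= p1.sum()      # random distribution on n points
    p2 = rng.random(n); p2 /= p2.sum()
    lhs = entropy((p1 + p2) / 2) - 0.5 * entropy(p1) - 0.5 * entropy(p2)
    tv = 0.5 * float(np.abs(p1 - p2).sum()) # total variation distance
    rhs = binary_entropy(tv / 2)
    if lhs > rhs + 1e-12:                   # tolerance for floating-point error
        violations += 1

print(violations)  # 0: no counterexample found
```

The check also confirms the equality case: for disjoint supports (e.g. $p_1=\delta_0$, $p_2=\delta_1$) we get $d_{TV}=1$ and both sides equal $\log 2$.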