Bounding entropy difference using relative entropy


Can one directly bound the entropy difference $|H(p) - H(q)|$ for distributions $p$ and $q$ in terms of the relative entropy $D(p\|q)$, without first going through Pinsker's inequality and then applying a continuity bound for entropy in terms of the resulting one-norm distance?
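To make the baseline concrete, the two-step route I have in mind is roughly the following (a sketch, with entropies and logarithms in nats and $d$ the alphabet size; the continuity bound is, if I am quoting it correctly, the one in Cover & Thomas, Thm. 17.3.3, which requires $\|p-q\|_1 \le \tfrac{1}{2}$):

$$\|p - q\|_1 \le \sqrt{2\,D(p\|q)} \qquad \text{(Pinsker)},$$
$$|H(p) - H(q)| \le \|p - q\|_1 \,\log\frac{d}{\|p - q\|_1},$$

so that, whenever $\sqrt{2\,D(p\|q)} \le \tfrac{1}{2}$, one gets

$$|H(p) - H(q)| \le \sqrt{2\,D(p\|q)}\,\log\frac{d}{\sqrt{2\,D(p\|q)}}.$$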

This is interesting to me because both quantities seem "natural" in information theory, and it is not clear to me why one should need to pass through the one-norm to prove bounds on $|H(p) - H(q)|$. If one cannot obtain a (much) better bound directly from $D(p\|q)$, I'd like to understand why.