Let $X,Y,Z$ be three random variables, and assume that $Y$ and $Z$ are binary. Let $I(A;B)$ denote the mutual information between the random variables $A$ and $B$. Is there a bound on $I(X;Y)$ in terms of $I(X;Z)$ and the probability $\Pr[Y=Z]$, such that the bound reduces to $I(X;Y)$ itself when $Y=Z$ with probability $1$?
Thanks
First we establish the following triangle-like inequality: $$H(A|B) \le H(A|C) + H(C|B),$$ which holds for all jointly distributed $A, B, C$. Indeed, since adding a variable cannot decrease joint entropy, and by the chain rule and the fact that conditioning cannot increase entropy, we have: $$H(A|B) \le H(A, C|B) = H(A|C, B) + H(C|B) \le H(A|C) + H(C|B).$$
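This triangle-like inequality is easy to check numerically. Here is a small sketch (illustrative only; the helper `H_cond` and the random-sampling loop are my own) that draws random joint distributions over three binary variables and verifies $H(A|B) \le H(A|C) + H(C|B)$ on each:

```python
import itertools
import math
import random

def H_cond(p, target, given):
    """Conditional entropy H(target | given) in bits.

    p is a joint pmf: a dict mapping outcome tuples to probabilities;
    target and given are tuples of coordinate indices."""
    joint, marg = {}, {}
    for outcome, prob in p.items():
        t = tuple(outcome[i] for i in target)
        g = tuple(outcome[i] for i in given)
        joint[(t, g)] = joint.get((t, g), 0.0) + prob
        marg[g] = marg.get(g, 0.0) + prob
    # H(T|G) = sum_{t,g} p(t,g) log2( p(g) / p(t,g) )
    return sum(prob * math.log2(marg[g] / prob)
               for (t, g), prob in joint.items() if prob > 0)

random.seed(0)
for _ in range(1000):
    # random joint pmf over three binary variables (A, B, C)
    weights = [random.random() for _ in range(8)]
    total = sum(weights)
    p = {abc: w / total
         for abc, w in zip(itertools.product((0, 1), repeat=3), weights)}
    lhs = H_cond(p, target=(0,), given=(1,))            # H(A|B)
    rhs = (H_cond(p, target=(0,), given=(2,))           # H(A|C)
           + H_cond(p, target=(2,), given=(1,)))        # H(C|B)
    assert lhs <= rhs + 1e-9
```

Sampling, of course, proves nothing; it only confirms the algebra above was transcribed correctly.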
Next, by Fano's inequality (https://en.wikipedia.org/wiki/Fano%27s_inequality), since $Y$ and $Z$ are binary the term $\Pr[Y \neq Z]\log_2(|\mathcal{Y}| - 1)$ vanishes, and we get: $$H(Y|Z),\ H(Z|Y) \le h(\Pr[Y\neq Z]),$$ where $h(x) = x\log_2(1/x) + (1 - x)\log_2(1/(1 - x))$ is the binary entropy function.
Now, applying the triangle-like inequality (with $C = Y$ and $C = Z$, respectively) together with Fano's bound, we get $$I(X;Y) - I(X;Z) = H(X|Z) - H(X|Y) \le H(Y|Z) \le h(\Pr[Y\neq Z]),$$ $$I(X;Z) - I(X;Y) = H(X|Y) - H(X|Z) \le H(Z|Y) \le h(\Pr[Y\neq Z]),$$ that is, $$|I(X;Y) - I(X;Z)| \le h(\Pr[Y\neq Z]),$$ and the right-hand side tends to $0$ as $\Pr[Y\neq Z] \to 0$, recovering $I(X;Y) = I(X;Z)$ when $Y = Z$ almost surely.
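The final bound can also be spot-checked numerically. The sketch below (the helpers `h` and `mutual_information` are my own, written for this check) samples random joint distributions of $(X, Y, Z)$ with $X$ ternary and $Y, Z$ binary, and verifies $|I(X;Y) - I(X;Z)| \le h(\Pr[Y\neq Z])$:

```python
import itertools
import math
import random

def h(x):
    """Binary entropy in bits."""
    if x in (0.0, 1.0):
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def mutual_information(p, i, j):
    """I between coordinates i and j of a joint pmf p (dict outcome -> prob)."""
    pi, pj, pij = {}, {}, {}
    for o, prob in p.items():
        pi[o[i]] = pi.get(o[i], 0.0) + prob
        pj[o[j]] = pj.get(o[j], 0.0) + prob
        pij[(o[i], o[j])] = pij.get((o[i], o[j]), 0.0) + prob
    return sum(prob * math.log2(prob / (pi[a] * pj[b]))
               for (a, b), prob in pij.items() if prob > 0)

random.seed(1)
for _ in range(1000):
    # random joint pmf over (X, Y, Z): X in {0,1,2}, Y and Z binary
    outcomes = list(itertools.product((0, 1, 2), (0, 1), (0, 1)))
    weights = [random.random() for _ in outcomes]
    total = sum(weights)
    p = dict(zip(outcomes, (w / total for w in weights)))
    gap = abs(mutual_information(p, 0, 1) - mutual_information(p, 0, 2))
    p_diff = sum(prob for (x, y, z), prob in p.items() if y != z)
    assert gap <= h(p_diff) + 1e-9
```

Note that the bound is vacuous once $\Pr[Y \neq Z] = 1/2$, since $h(1/2) = 1$ bit already exceeds the largest possible gap between the two binary mutual informations.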