Consider three random variables $X, Y, Z$ satisfying $X \perp Z \mid Y$. By the data processing inequality: $I(X;Z) \leq I(X;Y)$.
Now, consider the alternative setting where $Y_\epsilon = Y + \epsilon_1$ and $Z_\epsilon = Z + \epsilon_2$ for independent zero-mean noise variables $\epsilon_1, \epsilon_2$. Denote by $\mathcal{Y}_\epsilon$ a set of independent "copies" of $Y_\epsilon$ and assume that $\mathrm{Var}[Y \mid \mathcal{Y}_\epsilon] \leq \tau$ for some arbitrarily small $\tau > 0$.
In general, $X \not\perp Z_\epsilon \mid \mathcal{Y}_\epsilon$. However, intuitively, the conditional independence should still hold "approximately", and the "information gain" about $X$ from the noisy $Z_\epsilon$ should be small. Is it possible to give a bound on $I(X;Z_\epsilon)$ in terms of $I(X;\mathcal{Y}_\epsilon)$, analogous to the data processing inequality?
In general we have $$I(X;Y) - I(X;Z) = H(X|Z) -H(X|Y) \ge H(X|Y,Z)-H(X|Y),$$ since conditioning cannot increase entropy, i.e. $H(X|Y,Z) \le H(X|Z)$.
In the original setting, the Markov condition $X \perp Z \mid Y$ gives $H(X|Y,Z)=H(X|Y)$, and the data processing inequality follows.
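As a quick numerical sanity check (not part of the argument), we can build a random discrete Markov chain $X \to Y \to Z$ and verify both facts above: $H(X|Y,Z)=H(X|Y)$ and the resulting inequality $I(X;Y)\ge I(X;Z)$. All names and alphabet sizes here are my own illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def H(p):
    """Shannon entropy (nats) of a probability array, ignoring zeros."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Random Markov chain X -> Y -> Z, so X ⊥ Z | Y by construction
nx, ny, nz = 3, 4, 3
p_x  = rng.dirichlet(np.ones(nx))
p_yx = rng.dirichlet(np.ones(ny), size=nx)   # rows: p(y|x)
p_zy = rng.dirichlet(np.ones(nz), size=ny)   # rows: p(z|y)
joint = np.einsum('x,xy,yz->xyz', p_x, p_yx, p_zy)  # p(x,y,z)

def I(p_uv):
    """Mutual information computed from a 2-D joint distribution."""
    return H(p_uv.sum(1)) + H(p_uv.sum(0)) - H(p_uv)

I_xy = I(joint.sum(axis=2))   # I(X;Y)
I_xz = I(joint.sum(axis=1))   # I(X;Z)

# H(X|Y,Z) = H(X,Y,Z) - H(Y,Z)  vs  H(X|Y) = H(X,Y) - H(Y)
H_x_yz = H(joint) - H(joint.sum(axis=0))
H_x_y  = H(joint.sum(axis=2)) - H(joint.sum(axis=(0, 2)))

print(I_xy, I_xz, H_x_yz, H_x_y)
```

Both checks hold up to floating-point error: $H(X|Y,Z)$ and $H(X|Y)$ coincide, and $I(X;Y) \ge I(X;Z)$.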
Now introduce $A=\mathcal{Y}_\epsilon$ and $B=Z_\epsilon$.
Let $\alpha = H(Y|A)=H(Y)+H(A|Y)-H(A)$ and $\beta = H(Z|B)= H(Z)+H(B|Z)-H(B)$.
We expect both to be small: $\alpha$ because the assumption $\mathrm{Var}[Y \mid \mathcal{Y}_\epsilon] \leq \tau$ pins down $Y$ given $A$, and $\beta$ whenever the noise $\epsilon_2$ is small.
Also define $\alpha' = \alpha - H(Y|X,A) \le \alpha$ and $\beta' = \beta - H(Z|X,B) \le \beta$; equivalently, $\alpha' = I(X;Y \mid A) \ge 0$ and $\beta' = I(X;Z \mid B) \ge 0$.
Now $$ \begin{align} I(X;A)-I(X;B) &= H(X|B)-H(X|A)\\ &=H( X| Z, B) + H(Z|B) - H(Z| X, B) - [H(X |Y,A) + H(Y|A) - H(Y|X,A) ] \\ &=H(X |Z) - H(X|Y) + \beta' - \alpha'\\ &=I(X;Y) - I(X;Z) + \beta' - \alpha' \\ & \ge \beta' - \alpha' \end{align} $$
where the third equality uses $H(X|Z,B)=H(X|Z)$ and $H(X|Y,A)=H(X|Y)$ — both hold because the noise variables are independent of everything else, so $X \perp B \mid Z$ and $X \perp A \mid Y$ — and the final inequality is the data processing inequality for the original chain. Rearranging, and using $\beta' \ge 0$ and $\alpha' \le \alpha$, we obtain the desired bound: $$I(X;Z_\epsilon) \le I(X;\mathcal{Y}_\epsilon) + \alpha' - \beta' \le I(X;\mathcal{Y}_\epsilon) + H(Y \mid \mathcal{Y}_\epsilon).$$
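We can again sanity-check the final inequality numerically. The sketch below uses discrete alphabets, with random stochastic channels standing in for the additive noise $\epsilon_1, \epsilon_2$ (my simplification: $A$ depends only on $Y$ and $B$ only on $Z$, which is all the derivation actually needs; the additive-Gaussian case would require differential entropies instead).

```python
import numpy as np

rng = np.random.default_rng(0)

def H(p):
    """Shannon entropy (nats) of a probability array, ignoring zeros."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def rand_channel(n_in, n_out):
    """Random row-stochastic matrix: conditional distribution p(out|in)."""
    return rng.dirichlet(np.ones(n_out), size=n_in)

# Markov chain X -> Y -> Z, plus noisy observations A of Y and B of Z
nx, ny, nz, na, nb = 3, 4, 3, 5, 4
p_x  = rng.dirichlet(np.ones(nx))
p_yx = rand_channel(nx, ny)   # p(y|x)
p_zy = rand_channel(ny, nz)   # p(z|y)
p_ay = rand_channel(ny, na)   # p(a|y), stands in for Y_eps = Y + eps1
p_bz = rand_channel(nz, nb)   # p(b|z), stands in for Z_eps = Z + eps2

# Full joint p(x,y,z,a,b) over axes 'xyzab'
joint = np.einsum('x,xy,yz,ya,zb->xyzab', p_x, p_yx, p_zy, p_ay, p_bz)

def marg(axes):
    """Marginal of the joint over the named subset of axes 'xyzab'."""
    keep = tuple('xyzab'.index(c) for c in axes)
    drop = tuple(i for i in range(5) if i not in keep)
    return joint.sum(axis=drop)

def I(u, v):
    """Mutual information I(U;V) between two groups of variables."""
    return H(marg(u)) + H(marg(v)) - H(marg(u + v))

alpha  = H(marg('ya')) - H(marg('a'))               # H(Y|A)
alphap = alpha - (H(marg('xya')) - H(marg('xa')))   # alpha - H(Y|X,A)
beta   = H(marg('zb')) - H(marg('b'))               # H(Z|B)
betap  = beta - (H(marg('xzb')) - H(marg('xb')))    # beta - H(Z|X,B)

# Derived bound: I(X;A) - I(X;B) >= beta' - alpha'
lhs = I('x', 'a') - I('x', 'b')
print(lhs, betap - alphap)
```

Running this for different seeds, the inequality $I(X;A)-I(X;B) \ge \beta' - \alpha'$ holds every time, along with the auxiliary facts $0 \le \alpha' \le \alpha$ and $0 \le \beta' \le \beta$.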