Suppose $X, Y, Z$ are three discrete random variables.
Is there a good sufficient and necessary condition for $I(X;Y|Z) = I(X;Y)$?
In general the LHS can be either larger or smaller than the RHS, but if $Z$ is constant then the two are equal. This, however, is a rather trivial observation.
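For concreteness, both directions already occur with binary variables: taking $X, Y$ i.i.d. fair bits and $Z = X \oplus Y$ gives $I(X;Y) = 0$ but $I(X;Y|Z) = 1$ bit, whereas taking $X = Y = Z$ a single fair bit gives $I(X;Y) = 1$ bit but $I(X;Y|Z) = 0$. Below is a small numerical sanity check of these two examples (a rough sketch in NumPy; the helper functions are ad hoc, not from any particular library):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (zero entries are ignored)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(pxy):
    """I(X;Y) from a joint pmf given as a 2-D array indexed by (x, y)."""
    return entropy(pxy.sum(axis=1)) + entropy(pxy.sum(axis=0)) - entropy(pxy.ravel())

def conditional_mutual_information(pxyz):
    """I(X;Y|Z) = sum_z p(z) I(X;Y|Z=z), from a 3-D joint pmf indexed by (x, y, z)."""
    pz = pxyz.sum(axis=(0, 1))
    return sum(pz[z] * mutual_information(pxyz[:, :, z] / pz[z])
               for z in range(len(pz)) if pz[z] > 0)

# Example 1: X, Y i.i.d. fair bits, Z = X XOR Y  ->  I(X;Y) = 0 < I(X;Y|Z) = 1.
p1 = np.zeros((2, 2, 2))
for x in (0, 1):
    for y in (0, 1):
        p1[x, y, x ^ y] = 0.25
print(mutual_information(p1.sum(axis=2)), conditional_mutual_information(p1))  # 0.0 1.0

# Example 2: X = Y = Z, a single fair bit copied three times  ->  I(X;Y) = 1 > I(X;Y|Z) = 0.
p2 = np.zeros((2, 2, 2))
p2[0, 0, 0] = p2[1, 1, 1] = 0.5
print(mutual_information(p2.sum(axis=2)), conditional_mutual_information(p2))  # 1.0 0.0
```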
Motivation
Conditional mutual information generalizes both (unconditional) mutual information and conditional entropy. For the latter, one can make the following trivial observation and state the corresponding sufficient and necessary condition.
(trivial observation) $I(X;Y|Z) = H(X|Z)$ holds whenever knowing $Y$ contributes enough information; for example, it holds when $X$ is a function of $Y$. Therefore conditional mutual information generalizes conditional entropy.
(sufficient and necessary condition) $I(X;Y|Z) = H(X|Z)$ if and only if $X$ is a function of $(Y,Z)$.
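For the record, both statements follow from the identity
$$I(X;Y|Z) = H(X|Z) - H(X|Y,Z),$$
so $I(X;Y|Z) = H(X|Z)$ holds exactly when $H(X|Y,Z) = 0$, i.e. when $X$ is (almost surely) a function of $(Y,Z)$.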
For the former point, I can't seem to find any meaningful sufficient and necessary condition.
If the pair $(X,Y)$ is (jointly) independent of $Z$, then $I(X;Y|Z)=I(X;Y)$; note that $X$ being independent of $Z$ and $Y$ being independent of $Z$ separately is not enough, as the XOR example above shows.
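To spell out why: if $(X,Y)$ is independent of $Z$, then $p(x,y|z) = p(x,y)$ for every $z$ with $p(z) > 0$, so
$$I(X;Y|Z) = \sum_z p(z)\, I(X;Y|Z=z) = \sum_z p(z)\, I(X;Y) = I(X;Y).$$
However, this condition is sufficient but not necessary: for instance, with $X, Y$ independent and $Z = X$, both sides equal $0$ even though $(X,Y)$ is not independent of $Z$ (unless $X$ is constant).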