Let $X$, $Y$, and $Z$ be three random variables. We know that the conditional mutual information can be either greater than or less than the unconditional mutual information, i.e., both $$I(X;Y\mid Z) \ge I(X;Y)$$ and $$I(X;Y\mid Z) \le I(X;Y)$$ are possible.
Question: if $Y$ and $Z$ are dependent, and/or $X$ and $Z$ are dependent, is that always sufficient to conclude that
$$I(X;Y\mid Z) \le I(X;Y)?$$
For a Markov chain $X - Z - Y$ we know the inequality always holds (there $I(X;Y\mid Z)=0$), but is the Markov condition both necessary and sufficient for it to hold?
Note that by the chain rule \begin{align} I(X;Y) = I(X;Y\mid Z)+I(Z;Y)-I(Z;Y\mid X). \end{align}
So one has to establish the sign of $\Delta=I(Z;Y)-I(Z;Y\mid X)$: the inequality $I(X;Y\mid Z)\le I(X;Y)$ holds if and only if $\Delta \ge 0$.
Beyond that, I don't think much more can be said unless we know more about the joint distribution of $(X,Y,Z)$.
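Here is a quick numerical sketch of the identity above using plug-in entropies. The distribution is just one illustrative choice of mine (not from the question): $X, Y$ i.i.d. fair bits and $Z = X + Y$, so that $Z$ is dependent on each of $X$ and $Y$ while $X$ and $Y$ are independent.

```python
import numpy as np

def H(p):
    """Shannon entropy (bits) of a pmf given as an array; zero cells ignored."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def marg(p, keep):
    """Marginal pmf over the axes in `keep` of a joint pmf p[x, y, z]."""
    drop = tuple(i for i in range(p.ndim) if i not in keep)
    return p.sum(axis=drop)

def I(p, a, b):
    """I(A;B) = H(A) + H(B) - H(A,B)."""
    return H(marg(p, (a,))) + H(marg(p, (b,))) - H(marg(p, (a, b)))

def Ic(p, a, b, c):
    """I(A;B|C) = H(A,C) + H(B,C) - H(C) - H(A,B,C)."""
    return H(marg(p, (a, c))) + H(marg(p, (b, c))) - H(marg(p, (c,))) - H(p)

# Joint pmf p[x, y, z] with X, Y i.i.d. fair bits and Z = X + Y.
p = np.zeros((2, 2, 3))
for x in range(2):
    for y in range(2):
        p[x, y, x + y] = 0.25

X, Y, Z = 0, 1, 2
print("I(X;Z)   =", I(p, X, Z))      # positive: Z depends on X
print("I(Y;Z)   =", I(p, Y, Z))      # positive: Z depends on Y
print("I(X;Y)   =", I(p, X, Y))      # 0: X and Y are independent
print("I(X;Y|Z) =", Ic(p, X, Y, Z))  # 0.5 bit
delta = I(p, Z, Y) - Ic(p, Z, Y, X)
print("Delta    =", delta)           # negative for this distribution
# Chain-rule identity: I(X;Y) = I(X;Y|Z) + I(Z;Y) - I(Z;Y|X)
assert abs(I(p, X, Y) - (Ic(p, X, Y, Z) + delta)) < 1e-12
```

In this example $Z$ is dependent on both $X$ and $Y$, yet $\Delta = -0.5 < 0$ and $I(X;Y\mid Z) = 0.5 > 0 = I(X;Y)$, consistent with the remark that the sign of $\Delta$ hinges on the full joint distribution.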