Does mutual dependency imply $I(X;Y|Z) \le I(X;Y)$?


Let $X,Y$, and $Z$ be three random variables. We know that the conditional mutual information can be either greater or smaller than the unconditional mutual information, i.e., both $$I(X;Y|Z) \ge I(X;Y)$$ and $$I(X;Y|Z) \le I(X;Y)$$ are possible.

Question: if $Y$ and $Z$ are mutually dependent, and similarly $X$ and $Z$ are mutually dependent, is that always sufficient to conclude that

$$I(X;Y|Z) \le I(X;Y)? $$

For a Markov chain we know the inequality always holds, but is the Markov-chain condition both necessary and sufficient for it to be valid?


Note that by the chain rule \begin{align} I(X;Y) = I(X;Y\mid Z)+I(Z;Y)-I(Z;Y\mid X). \end{align}
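As a sanity check, this identity can be verified numerically (a sketch in Python; the joint pmf below is an arbitrary random choice, not from the question):

```python
import itertools
import math
import random

def marginal(p, idx):
    """Marginal pmf of the coordinates listed in idx, from a joint pmf dict."""
    m = {}
    for k, v in p.items():
        key = tuple(k[i] for i in idx)
        m[key] = m.get(key, 0.0) + v
    return m

def mi(p, a, b):
    """I(A;B) in bits; a and b are tuples of coordinate indices into the keys of p."""
    pab, pa, pb = marginal(p, a + b), marginal(p, a), marginal(p, b)
    return sum(v * math.log2(v / (pa[k[:len(a)]] * pb[k[len(a):]]))
               for k, v in pab.items() if v > 0)

def cmi(p, a, b, c):
    """I(A;B|C) via the chain rule I(A;B,C) = I(A;C) + I(A;B|C)."""
    return mi(p, a, b + c) - mi(p, a, c)

# Arbitrary joint pmf over binary (X, Y, Z); coordinates 0, 1, 2.
random.seed(0)
w = [random.random() for _ in range(8)]
p = {k: wi / sum(w) for k, wi in zip(itertools.product((0, 1), repeat=3), w)}

lhs = mi(p, (0,), (1,))  # I(X;Y)
rhs = cmi(p, (0,), (1,), (2,)) + mi(p, (2,), (1,)) - cmi(p, (2,), (1,), (0,))
print(abs(lhs - rhs) < 1e-12)  # the identity holds for any joint pmf
```

The identity is algebraic, so it holds for every joint pmf, not just this one.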

So one has to determine the sign of $\Delta=I(Z;Y)-I(Z;Y\mid X)$:

  1. If $Z$ and $Y$ are independent, then $\Delta \le 0$, so $I(X;Y) \le I(X;Y\mid Z)$.
  2. If $Z$ and $Y$ are conditionally independent given $X$, then $\Delta \ge 0$, so $I(X;Y\mid Z) \le I(X;Y)$.
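Case 2 is exactly the Markov-chain situation $Z - X - Y$. A numerical illustration (a sketch, assuming an arbitrary random Markov factorization $p(x)\,p(z\mid x)\,p(y\mid x)$ over binary variables):

```python
import math
import random

def marginal(p, idx):
    """Marginal pmf of the coordinates listed in idx, from a joint pmf dict."""
    m = {}
    for k, v in p.items():
        key = tuple(k[i] for i in idx)
        m[key] = m.get(key, 0.0) + v
    return m

def mi(p, a, b):
    """I(A;B) in bits; a and b are tuples of coordinate indices."""
    pab, pa, pb = marginal(p, a + b), marginal(p, a), marginal(p, b)
    return sum(v * math.log2(v / (pa[k[:len(a)]] * pb[k[len(a):]]))
               for k, v in pab.items() if v > 0)

def cmi(p, a, b, c):
    """I(A;B|C) = I(A;B,C) - I(A;C)."""
    return mi(p, a, b + c) - mi(p, a, c)

def rand_dist(n):
    w = [random.random() for _ in range(n)]
    return [x / sum(w) for x in w]

random.seed(1)
px = rand_dist(2)
pz_given_x = [rand_dist(2) for _ in range(2)]  # rows indexed by x
py_given_x = [rand_dist(2) for _ in range(2)]

# Joint pmf keyed (x, y, z) with the factorization p(x) p(z|x) p(y|x),
# which makes Z and Y conditionally independent given X.
p = {(x, y, z): px[x] * pz_given_x[x][z] * py_given_x[x][y]
     for x in (0, 1) for y in (0, 1) for z in (0, 1)}

assert cmi(p, (2,), (1,), (0,)) < 1e-12  # I(Z;Y|X) = 0 by construction
print(cmi(p, (0,), (1,), (2,)) <= mi(p, (0,), (1,)) + 1e-12)  # I(X;Y|Z) <= I(X;Y)
```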

Otherwise, I don't think much more can be said unless we know more about the joint distribution of $(X,Y,Z)$.
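In particular, pairwise dependence alone is not sufficient. A concrete counterexample (my own sketch, not from the question): take $X, Y$ independent Bernoulli$(0.3)$ and $Z = X \oplus Y$. Because the bits are biased, $Z$ is dependent on both $X$ and $Y$, yet $I(X;Y) = 0 < I(X;Y\mid Z)$:

```python
import math

def marginal(p, idx):
    """Marginal pmf of the coordinates listed in idx, from a joint pmf dict."""
    m = {}
    for k, v in p.items():
        key = tuple(k[i] for i in idx)
        m[key] = m.get(key, 0.0) + v
    return m

def mi(p, a, b):
    """I(A;B) in bits; a and b are tuples of coordinate indices."""
    pab, pa, pb = marginal(p, a + b), marginal(p, a), marginal(p, b)
    return sum(v * math.log2(v / (pa[k[:len(a)]] * pb[k[len(a):]]))
               for k, v in pab.items() if v > 0)

def cmi(p, a, b, c):
    """I(A;B|C) = I(A;B,C) - I(A;C)."""
    return mi(p, a, b + c) - mi(p, a, c)

# X, Y i.i.d. Bernoulli(0.3), Z = X xor Y; keys are (x, y, z).
q = 0.3
p = {(x, y, x ^ y): (q if x else 1 - q) * (q if y else 1 - q)
     for x in (0, 1) for y in (0, 1)}

print(mi(p, (0,), (2,)) > 0)      # X and Z are dependent
print(mi(p, (1,), (2,)) > 0)      # Y and Z are dependent
print(mi(p, (0,), (1,)) < 1e-12)  # I(X;Y) = 0
print(cmi(p, (0,), (1,), (2,)))   # I(X;Y|Z) ≈ 0.78 bits > 0
```

(With fair coins, $q = 1/2$, $Z$ would be pairwise independent of both $X$ and $Y$, so the bias is what makes this a counterexample to the question as stated.)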