Conditional mutual information zero in case of conditional independence: $I(X;Z|Y) = 0$


I want to show that if $X \to Y \to Z$ is a Markov chain (so that $X$ and $Z$ are conditionally independent given $Y$), then the conditional mutual information $I(X;Z|Y) = 0$.

My thoughts: $I(X;Z|Y) = H(X|Z) - H(X|Y,Z) = H(X,Z) - H(Z) - H(X|Y,Z)$ but I can't go on any further...any ideas?


BEST ANSWER

If $(X,Y, Z)$ is Markovian, then $H(Z|Y,X)=H(Z|Y).$

And just remember that mutual information is symmetric in its arguments; precisely: $$I(X;Y)=I(Y;X).$$

EDIT: to be more precise: $$I(X;Z|Y)=I(Z;X|Y)=H(Z|Y)-H(Z|X, Y)=H(Z|Y)-H(Z|Y)=0$$

Also, I see now that you defined $I(X;Z|Y)$ incorrectly: $$I(X;Z|Y)\neq H(X|Z)-H(X|Y,Z);$$ the correct identity is $$I(X;Z|Y)= H(X|Y)-H(X|Y,Z).$$

As for why $H(Z|X,Y)=H(Z|Y)$, look at $$P\{X=x, Y=y, Z=z\}=P\{X=x\}\cdot P\{Y=y|X=x\}\cdot P\{Z=z|Y=y\},$$ try to understand why it holds, and derive from it that $X$ and $Z$ are in fact independent given $Y$.
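The factorization above can be checked numerically. Here is a small sketch (my own illustration, not part of the original answer; the alphabet sizes and the helper `rand_dist` are arbitrary choices) that builds a joint distribution of the Markov form $p(x)\,p(y|x)\,p(z|y)$ and evaluates $I(X;Z|Y)$ directly from its definition:

```python
import itertools
import math
import random

random.seed(0)

def rand_dist(n):
    """Random probability vector of length n (arbitrary helper)."""
    w = [random.random() for _ in range(n)]
    s = sum(w)
    return [v / s for v in w]

# Markov chain X -> Y -> Z on small alphabets:
# p(x, y, z) = p(x) * p(y|x) * p(z|y)
nx, ny, nz = 3, 4, 3
px = rand_dist(nx)
py_x = [rand_dist(ny) for _ in range(nx)]   # rows: p(y | x)
pz_y = [rand_dist(nz) for _ in range(ny)]   # rows: p(z | y)

p = {(x, y, z): px[x] * py_x[x][y] * pz_y[y][z]
     for x, y, z in itertools.product(range(nx), range(ny), range(nz))}

# Marginals p(x,y), p(y,z), p(y) needed for I(X;Z|Y)
pxy, pyz, py = {}, {}, {}
for (x, y, z), v in p.items():
    pxy[(x, y)] = pxy.get((x, y), 0.0) + v
    pyz[(y, z)] = pyz.get((y, z), 0.0) + v
    py[y] = py.get(y, 0.0) + v

# I(X;Z|Y) = sum_{x,y,z} p(x,y,z) * log[ p(y) p(x,y,z) / (p(x,y) p(y,z)) ]
I = sum(v * math.log(py[y] * v / (pxy[(x, y)] * pyz[(y, z)]))
        for (x, y, z), v in p.items() if v > 0)

print(f"I(X;Z|Y) = {I:.12f}")
```

Any choice of the three factors gives $I(X;Z|Y)=0$ up to floating-point error, which is exactly the conditional-independence statement.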

ANOTHER ANSWER

It's quite simple. In general

$$I(X;Z) = H(Z) - H(Z\mid X) $$

and conditioning on $Y$

$$I(X;Z \mid Y) = H(Z \mid Y) - H(Z\mid X,Y) $$

Now, if $X\to Y \to Z$ is a Markov chain, then $H(Z\mid X,Y)=H(Z\mid Y)$, and therefore $$I(X;Z \mid Y)=0.$$
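As a sanity check on the key step $H(Z\mid X,Y)=H(Z\mid Y)$, here is a minimal sketch (my own illustration; the small alphabets and the helper `rand_dist` are arbitrary) that computes both conditional entropies for a randomly generated Markov chain:

```python
import itertools
import math
import random

random.seed(1)

def rand_dist(n):
    """Random probability vector of length n (arbitrary helper)."""
    w = [random.random() for _ in range(n)]
    s = sum(w)
    return [v / s for v in w]

# Markov chain joint: p(x,y,z) = p(x) p(y|x) p(z|y)
nx, ny, nz = 2, 3, 2
px = rand_dist(nx)
py_x = [rand_dist(ny) for _ in range(nx)]   # rows: p(y | x)
pz_y = [rand_dist(nz) for _ in range(ny)]   # rows: p(z | y)

p = {(x, y, z): px[x] * py_x[x][y] * pz_y[y][z]
     for x, y, z in itertools.product(range(nx), range(ny), range(nz))}

pxy, pyz, py = {}, {}, {}
for (x, y, z), v in p.items():
    pxy[(x, y)] = pxy.get((x, y), 0.0) + v
    pyz[(y, z)] = pyz.get((y, z), 0.0) + v
    py[y] = py.get(y, 0.0) + v

# H(Z|X,Y) = -sum p(x,y,z) log p(z|x,y),  with p(z|x,y) = p(x,y,z)/p(x,y)
H_z_xy = -sum(v * math.log(v / pxy[(x, y)])
              for (x, y, z), v in p.items() if v > 0)

# H(Z|Y) = -sum p(y,z) log p(z|y),  with p(z|y) = p(y,z)/p(y)
H_z_y = -sum(v * math.log(v / py[y])
             for (y, z), v in pyz.items() if v > 0)

print(H_z_xy, H_z_y)
```

The two printed values agree to floating-point precision, so the difference $I(X;Z\mid Y)=H(Z\mid Y)-H(Z\mid X,Y)$ vanishes.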