Chain rule for $I(X;Y,Z)=I(X;Y)+I(X;Z|Y)$


I know that the chain rule for the mutual information $I(X_1,\dots,X_n;Y)$ states that:

$$I(X_1,\dots,X_n;Y) = I(X_1;Y) + I(X_2;Y|X_1) + \dots + I(X_n;Y|X_1,\dots,X_{n-1})$$

But the lecture notes for one of my degree courses say that:

$$I(X;Y,Z) = I(X;Y) + I(X;Z|Y)$$

And I do not really see how they get there. They are describing the mutual information for a Markov chain $X-Y-Z$.

---

**Answer:**
$X-Y-Z$ is a Markov chain iff $I(X;Z|Y)=0$. The identity in your notes is just the chain rule with $n=2$, applied on the other argument of the mutual information: since $I(A;B)=I(B;A)$, we have $I(X;Y,Z)=I(Y,Z;X)=I(Y;X)+I(Z;X|Y)=I(X;Y)+I(X;Z|Y)$. Both decompositions ($I(X;Y)+I(X;Z|Y)$ and $I(X;Z)+I(X;Y|Z)$) are possible via the chain rule, and mutual information is always greater than or equal to zero.
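To make this concrete, here is a small numerical check of the identity. The joint distribution below is a hypothetical example I chose, built as $p(x,y,z)=p(x)\,p(y|x)\,p(z|y)$ so that $X-Y-Z$ is a Markov chain; all mutual informations are computed from entropies of marginals.

```python
import numpy as np
from itertools import product

# Hypothetical joint pmf p(x,y,z) = p(x) p(y|x) p(z|y): binary X, Y, Z
# forming a Markov chain X - Y - Z. These numbers are arbitrary examples.
px = np.array([0.3, 0.7])
py_x = np.array([[0.9, 0.1],    # p(y | x=0)
                 [0.2, 0.8]])   # p(y | x=1)
pz_y = np.array([[0.6, 0.4],    # p(z | y=0)
                 [0.25, 0.75]]) # p(z | y=1)

p = np.zeros((2, 2, 2))
for x, y, z in product(range(2), repeat=3):
    p[x, y, z] = px[x] * py_x[x, y] * pz_y[y, z]

def H(pm):
    """Entropy (in bits) of a pmf given as an array of any shape."""
    pm = pm[pm > 0]
    return -np.sum(pm * np.log2(pm))

# Marginals needed for the entropy expansions
pxy = p.sum(axis=2)
pyz = p.sum(axis=0)
py  = p.sum(axis=(0, 2))
pX  = p.sum(axis=(1, 2))

# I(X; Y,Z) = H(X) + H(Y,Z) - H(X,Y,Z)
I_X_YZ = H(pX) + H(pyz) - H(p)
# I(X; Y) = H(X) + H(Y) - H(X,Y)
I_X_Y = H(pX) + H(py) - H(pxy)
# I(X; Z|Y) = H(X,Y) + H(Y,Z) - H(Y) - H(X,Y,Z)
I_X_Z_given_Y = H(pxy) + H(pyz) - H(py) - H(p)

print(I_X_YZ - (I_X_Y + I_X_Z_given_Y))  # chain rule: difference is ~0
print(I_X_Z_given_Y)                     # Markov property: ~0
```

The first printed value is (numerically) zero for *any* joint distribution, because the chain rule always holds; the second is zero only because this particular distribution was constructed to satisfy the Markov property.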