Definition of conditional mutual information

The definition of conditional mutual information I've read is $$ I(X;Y|Z) = H(X|Z)-H(X|Y,Z) $$ where H is entropy.

Since I am not sure where the $X|Z$ term comes from, I was wondering whether conditioning on $Z$ applies to both $X$ and $Y$. In that case, could we write $$ I(X;Y|Z) = I(X|Z;\,Y|Z)? $$

You should read $;$ as binding more strongly than $|$, so $X;Y|Z = (X;Y)|Z$, and this quantity does equal $(X|Z);(Y|Z)$: the conditioning on $Z$ applies to both arguments. So yes, your reading is correct.

Unfortunately information theory notation is not always so clear about things like this.
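The two readings can be checked numerically. Below is a small sketch (the joint pmf `p` and all variable names are illustrative) that computes $I(X;Y|Z)$ once as $H(X|Z)-H(X|Y,Z)$ and once as $\sum_z p(z)\, I(X;Y\mid Z=z)$, i.e. the mutual information of the pair $(X|Z, Y|Z)$ averaged over $z$, and confirms they agree:

```python
import numpy as np

rng = np.random.default_rng(0)
# Illustrative random joint pmf p(x, y, z) over small alphabets.
p = rng.random((2, 3, 2))
p /= p.sum()

def H(pmf):
    """Shannon entropy in bits of a joint pmf (any shape)."""
    q = pmf[pmf > 0]
    return -np.sum(q * np.log2(q))

# I(X;Y|Z) = H(X|Z) - H(X|Y,Z), with each conditional entropy
# expanded as H(A|B) = H(A,B) - H(B).
p_xz = p.sum(axis=1)       # p(x, z)
p_yz = p.sum(axis=0)       # p(y, z)
p_z = p.sum(axis=(0, 1))   # p(z)
i_def = (H(p_xz) - H(p_z)) - (H(p) - H(p_yz))

# Same quantity with both arguments conditioned on Z:
# the mutual information of p(x, y | z), averaged over z.
i_cond = 0.0
for z in range(p.shape[2]):
    p_xy_z = p[:, :, z] / p_z[z]          # p(x, y | z)
    px = p_xy_z.sum(axis=1)               # p(x | z)
    py = p_xy_z.sum(axis=0)               # p(y | z)
    i_cond += p_z[z] * (H(px) + H(py) - H(p_xy_z))

assert np.isclose(i_def, i_cond)
```

Both expressions reduce to $\sum_{x,y,z} p(x,y,z)\log\frac{p(x,y,z)\,p(z)}{p(x,z)\,p(y,z)}$, which is why they must coincide.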