Mutual information conditioned on Z = z


I understand that the mutual information $I(X;Y)$ can be calculated as follows:

$$I(X;Y)=\sum_{x}\sum_{y} p(x,y)\log\frac{p(x,y)}{p(x)\,p(y)},$$

while the conditional mutual information $I(X;Y\mid Z)$ can be calculated as follows:

$$I(X;Y\mid Z)=\sum_{z} p(z)\sum_{x}\sum_{y} p(x,y\mid z)\log\frac{p(x,y\mid z)}{p(x\mid z)\,p(y\mid z)}.$$

If I want to calculate the mutual information between $X$ and $Y$ conditioned on a particular value $Z=z$, which can be denoted $I(X;Y\mid Z=z)$, how can I do that? How can this be derived from the equations above? Is there any reference for that?


There are 2 answers below.

BEST ANSWER

To compute the mutual information (or joint entropy, or whatever) conditioned on a particular value of $Z$, you simply replace all the probability functions by the corresponding conditional probabilities.

Say, you know (among other equivalent formulas) that

$$I(X;Y)=H(X) - H(X\mid Y)= E_{X,Y}\left[\log\frac{p(X,Y)}{p(X)\,p(Y)}\right]$$

Then

$$I(X;Y\mid Z=z)=H(X\mid Z=z) - H(X\mid Y, Z=z)= E_{X,Y\mid Z=z}\left[\log\frac{p(X,Y \mid Z=z)}{p(X\mid Z=z)\,p(Y\mid Z=z)}\right]$$
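The substitution above can be checked numerically. Here is a minimal sketch for discrete variables, using a made-up joint pmf `p_xyz` over binary $X$, $Y$, $Z$ (the numbers are illustrative, not from the question):

```python
import math

# Hypothetical joint distribution p(x, y, z) over binary X, Y, Z
# (illustrative numbers only; any valid joint pmf works).
p_xyz = {
    (0, 0, 0): 0.20, (0, 1, 0): 0.10,
    (1, 0, 0): 0.05, (1, 1, 0): 0.15,
    (0, 0, 1): 0.05, (0, 1, 1): 0.15,
    (1, 0, 1): 0.20, (1, 1, 1): 0.10,
}

def mi_given_z(p_xyz, z):
    """I(X; Y | Z = z): mutual information computed from the
    conditional distribution p(x, y | Z = z)."""
    # p(z), then condition the joint on Z = z
    p_z = sum(p for (x, y, zz), p in p_xyz.items() if zz == z)
    p_xy = {(x, y): p / p_z
            for (x, y, zz), p in p_xyz.items() if zz == z}
    # conditional marginals p(x | z) and p(y | z)
    p_x, p_y = {}, {}
    for (x, y), p in p_xy.items():
        p_x[x] = p_x.get(x, 0.0) + p
        p_y[y] = p_y.get(y, 0.0) + p
    # E_{X,Y|Z=z}[ log p(x,y|z) / (p(x|z) p(y|z)) ], in bits
    return sum(p * math.log2(p / (p_x[x] * p_y[y]))
               for (x, y), p in p_xy.items() if p > 0)

def cmi(p_xyz):
    """I(X; Y | Z) as the p(z)-weighted average of I(X; Y | Z = z)."""
    zs = {z for (_, _, z) in p_xyz}
    return sum(sum(p for (x, y, zz), p in p_xyz.items() if zz == z)
               * mi_given_z(p_xyz, z)
               for z in zs)
```

Note that `mi_given_z` is literally the unconditional mutual-information formula applied to the conditional probabilities, which is the point of the answer.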

ANOTHER ANSWER

Take the second formula and look at the summation over $Z$ (the outermost one): keep only the single term corresponding to the value of $z$ you want, and drop the $p(z)$ weight in front of it.
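Written out in the same discrete notation as the question, this gives

$$I(X;Y\mid Z=z)=\sum_{x}\sum_{y} p(x,y\mid z)\log\frac{p(x,y\mid z)}{p(x\mid z)\,p(y\mid z)},$$

so that the conditional mutual information is its $p(z)$-weighted average: $I(X;Y\mid Z)=\sum_{z}p(z)\,I(X;Y\mid Z=z)$.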

Let me add that the notation you are using is very confusing when combined with that of the formulas you include (is $Z$ a random variable? A set? etc.).