Problem with calculating the mutual information


Knowing that $Z = f(X)$ (i.e. $Z$ and $X$ have a deterministic relation), how can we calculate the quantity

$I (Y ;X,Z)$?

I came up with the following relation, but I don't know how to proceed any further:

$I (Y ;X,Z) = I(Y;X) + I(Y; Z|X)$.

I know that $H(Z|X)=0$, but I don't see what that implies for the relation above.

**Best answer**

If $Z=f(X)$ is a deterministic function of $X$, then $H(Z|X)=0$, as you mentioned: knowing $X$ lets us determine $Z$ exactly. Note that $f$ does not need to be injective for this. For example, if $Z=X^2$ with $X\in\{-1,1\}$, then knowing $X$ gives $Z=1$ and thus $H(Z|X)=0$; but knowing $Z=1$ only tells us $X\in\{-1,1\}$, so $H(X|Z)\neq 0$. (Only when $f$ is injective do we also get $H(X|Z)=0$.)
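The asymmetry between $H(Z|X)=0$ and $H(X|Z)\neq 0$ can be checked numerically. Here is a minimal sketch using the $Z=X^2$ example above (the helper `H_cond` and the toy joint distribution are my own, not from the post):

```python
import math

# X uniform on {-1, 1}, Z = X^2, so Z = 1 with probability 1.
p = {(-1, 1): 0.5, (1, 1): 0.5}  # joint p(x, z) over pairs (x, z)

def H_cond(joint, cond_idx):
    """Conditional entropy H(other | variable at position cond_idx), in bits."""
    # Marginal of the conditioning variable.
    pc = {}
    for k, v in joint.items():
        pc[k[cond_idx]] = pc.get(k[cond_idx], 0.0) + v
    # H = -sum_{a,b} p(a,b) log2 p(a,b)/p(cond).
    h = 0.0
    for k, v in joint.items():
        if v > 0:
            h -= v * math.log2(v / pc[k[cond_idx]])
    return h

print(H_cond(p, cond_idx=0))  # H(Z|X) -> 0.0 (Z is determined by X)
print(H_cond(p, cond_idx=1))  # H(X|Z) -> 1.0 bit (Z reveals nothing about X)
```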

Now apply the chain rule for mutual information, as you wrote:

$$I (Y ;X,Z) = I(Y;X) + I(Y; Z|X)$$

By the definition of conditional mutual information, $I(Y; Z|X)=H(Z|X)-H(Z|Y,X)$. We know $H(Z|X)=0$, and since $0\le H(Z|Y,X)\le H(Z|X)$ (conditioning cannot increase entropy), we also have $H(Z|Y,X)=0$. Hence $I(Y;Z|X)=0-0=0$, and therefore $I(Y;X,Z) = I(Y;X)$. In other words, once $X$ is known, $Z$ is redundant: it carries no additional information about $Y$.
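The identity $I(Y;X,Z)=I(Y;X)$ can also be verified numerically. Below is a small sketch (the random joint distribution and the choice $f(x)=x\bmod 2$ are arbitrary assumptions of mine; any deterministic $f$ works), computing both sides via the entropy identities $I(Y;X)=H(Y)+H(X)-H(X,Y)$ and $I(Y;X,Z)=H(Y)+H(X,Z)-H(X,Y,Z)$:

```python
import math
import random

random.seed(0)

# Random joint p(x, y), then z = f(x) deterministically.
xs, ys = [0, 1, 2, 3], [0, 1]
w = {(x, y): random.random() for x in xs for y in ys}
total = sum(w.values())
pxy = {k: v / total for k, v in w.items()}

f = lambda x: x % 2  # any deterministic function of x

def H(dist):
    """Entropy of a distribution given as {outcome: probability}, in bits."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def marginal(joint, idx):
    """Marginalize a joint over tuples, keeping the positions in idx."""
    m = {}
    for k, p in joint.items():
        key = tuple(k[i] for i in idx)
        m[key] = m.get(key, 0.0) + p
    return m

# Joint over (x, y, z) with z = f(x).
pxyz = {(x, y, f(x)): p for (x, y), p in pxy.items()}

# I(Y; X) = H(Y) + H(X) - H(X, Y)
I_YX = H(marginal(pxyz, (1,))) + H(marginal(pxyz, (0,))) - H(marginal(pxyz, (0, 1)))

# I(Y; X, Z) = H(Y) + H(X, Z) - H(X, Y, Z)
I_YXZ = H(marginal(pxyz, (1,))) + H(marginal(pxyz, (0, 2))) - H(pxyz)

print(abs(I_YXZ - I_YX) < 1e-12)  # True: Z adds nothing beyond X
```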

For reference, mutual information satisfies $I(Y;Z) = H(Z)-H(Z|Y)=H(Y)-H(Y|Z)$. In the conditional case this becomes $$I(Y;Z|X) = H(Z|X)-H(Z|Y,X)=H(Y|X)-H(Y|Z,X).$$