$I(X;Z | Y ) = I(Z; Y | X) − I(Z; Y ) + I(X;Z)$


I am working on the following exercise from information theory:

Let $X, Y, Z : \Omega \rightarrow \mathbb{R}$ be random variables. Show that $$I(X;Z | Y ) = I(Z; Y | X) − I(Z; Y ) + I(X;Z).$$

Remark: $H$ denotes entropy and $I$ denotes mutual information. The mutual information of two discrete random variables $X, Y$ is given by

$$I(X;Y) := \sum_{x,y} P_{XY}(x,y) \log {P_{XY}(x,y) \over P_X(x) P_Y(y)} = E_{P_{XY}} \log{P_{XY} \over P_X P_Y} \, ,$$ where $P_{XY}(x,y)$ is their joint probability distribution. The following identities were proven in class:

$$I(X;X) = H(X)$$

\begin{align} I(X;Y) &= H(X)+H(Y)-H(X,Y) \\ &= H(X)-H(X \mid Y) \\ &= H(Y)-H(Y \mid X) \\ &= I(Y; X) \end{align}

$$I(X;Y \mid Z) = H(X \mid Z) - H(X \mid Y,Z) = H(Y \mid Z) - H(Y \mid X,Z)$$
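As a sanity check on the definition above, mutual information can be computed numerically from a joint pmf. A minimal sketch (the function name and the test distributions are my own, not from the exercise):

```python
import numpy as np

def mutual_information(p_xy):
    """Mutual information I(X;Y) in bits, given the joint pmf as a matrix p_xy."""
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal P_X as a column
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal P_Y as a row
    mask = p_xy > 0                         # convention: 0 * log 0 = 0
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / (p_x * p_y)[mask])))

# Independent X, Y give I(X;Y) = 0; a fair bit copied into Y gives I = H(X) = 1 bit.
p_indep = np.outer([0.5, 0.5], [0.25, 0.75])
p_equal = np.array([[0.5, 0.0], [0.0, 0.5]])
print(mutual_information(p_indep))  # 0.0
print(mutual_information(p_equal))  # 1.0
```

This matches the identity $I(X;X) = H(X)$ above: a uniform bit has entropy $1$.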

I do not see how to prove this. I tried to rewrite the $I$'s (mutual information) in terms of $H$'s (entropies) using formulas like

$$I(X; Z \mid Y) = H(X \mid Y) - H(X \mid Z,Y)$$

but it just does not work out. Could you help me?

Accepted answer:

Use the chain rule for mutual information: $$I(Z;Y) + I(Z;X \mid Y) = I(Z;X,Y) = I(X;Z) + I(Z;Y \mid X).$$ Rearranging the outer equality gives $I(X;Z \mid Y) = I(Z;Y \mid X) - I(Z;Y) + I(X;Z)$, as required.
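The identity can also be verified numerically on a random joint distribution, expressing each (conditional) mutual information through joint entropies, e.g. $I(X;Z \mid Y) = H(X,Y) + H(Y,Z) - H(Y) - H(X,Y,Z)$. A sketch (variable names are my own), with all entropies in bits:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random joint pmf P(x, y, z) over three binary variables.
p = rng.random((2, 2, 2))
p /= p.sum()

def H(pm):
    """Shannon entropy (bits) of a pmf given as an array of probabilities."""
    pm = pm[pm > 0]
    return float(-np.sum(pm * np.log2(pm)))

# Pairwise and single-variable marginals (axes: 0 = x, 1 = y, 2 = z).
p_xy = p.sum(axis=2); p_xz = p.sum(axis=1); p_yz = p.sum(axis=0)
p_x = p.sum(axis=(1, 2)); p_y = p.sum(axis=(0, 2)); p_z = p.sum(axis=(0, 1))

I_xz_given_y = H(p_xy) + H(p_yz) - H(p_y) - H(p)   # I(X;Z|Y)
I_zy_given_x = H(p_xz) + H(p_xy) - H(p_x) - H(p)   # I(Z;Y|X)
I_zy = H(p_z) + H(p_y) - H(p_yz)                   # I(Z;Y)
I_xz = H(p_x) + H(p_z) - H(p_xz)                   # I(X;Z)

print(np.isclose(I_xz_given_y, I_zy_given_x - I_zy + I_xz))  # True
```

Since the identity holds for every joint distribution, the check succeeds for any seed.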