Conditional joint entropy of two random variables


I am trying to prove the formula for the joint entropy of the random variables $X$ and $Y$ given $Z$, which is:

$$H(X,Y|Z) = H(X|Z) + H(Y|X,Z)$$

based on the definition of conditional entropy, which is:

$$H(Y|X) \triangleq E_{p_{X,Y}}\{-\log_2 p_{Y|X}(Y|X)\}$$
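Written out as a sum over the joint pmf (the form used throughout the derivation below), this definition becomes

$$H(Y|X) = -\sum\limits_{x}\sum\limits_{y} p_{X,Y}(x,y)\,\log_2 p_{Y|X=x}(y).$$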

So, starting from the LHS and using the factorization $p_{X,Y|Z=z}(x,y) = p_{Y|X=x,Z=z}(y)\cdot p_{X|Z=z}(x)$: \begin{eqnarray*} H(X,Y|Z) &=& E_{p_{X,Y,Z}}\{-\log_2 p_{X,Y|Z}(X,Y|Z)\} \\&=& -\sum\limits_{x}\sum\limits_{y}\sum\limits_{z} p_{X,Y,Z}(x,y,z)\cdot \log_2 p_{X,Y|Z=z}(x,y) \\&=& -\sum\limits_{x}\sum\limits_{y}\sum\limits_{z} p_{X,Y,Z}(x,y,z)\cdot \log_2\{p_{Y|X=x,Z=z}(y)\cdot p_{X|Z=z}(x)\} \\&=& -\sum\limits_{x}\sum\limits_{y}\sum\limits_{z} p_{X,Y,Z}(x,y,z)\cdot \Big[\log_2\{p_{Y|X=x,Z=z}(y)\} + \log_2\{p_{X|Z=z}(x)\}\Big] \\&=& -\sum\limits_{x}\sum\limits_{y}\sum\limits_{z} p_{X,Y,Z}(x,y,z)\cdot \log_2\{p_{Y|X=x,Z=z}(y)\} \\&& -\sum\limits_{x}\sum\limits_{y}\sum\limits_{z} p_{X,Y,Z}(x,y,z)\cdot \log_2\{p_{X|Z=z}(x)\} \\&=& -\sum\limits_{x}\sum\limits_{y}\sum\limits_{z} p_{X,Y,Z}(x,y,z)\cdot \log_2\{p_{Y|X=x,Z=z}(y)\} \\&& -\sum\limits_{x}\sum\limits_{z} \log_2\{p_{X|Z=z}(x)\}\sum\limits_{y} p_{X,Y,Z}(x,y,z) \\&=& -\sum\limits_{x}\sum\limits_{y}\sum\limits_{z} p_{X,Y,Z}(x,y,z)\cdot \log_2\{p_{Y|X=x,Z=z}(y)\} \\&& -\sum\limits_{x}\sum\limits_{z} \log_2\{p_{X|Z=z}(x)\}\cdot p_{X,Z}(x,z) \\&=& H(Y|X,Z) + H(X|Z) \end{eqnarray*}
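As a quick numerical sanity check of the identity (not part of the proof), here is a minimal sketch in Python/NumPy that evaluates all three conditional entropies directly from the definition above, for an arbitrary toy joint pmf on $\{0,1\}^3$. The distribution and all variable names are my own illustrative choices, not anything from the question.

```python
import numpy as np

# Hypothetical toy joint pmf p(x, y, z) on {0,1}^3, chosen arbitrarily.
rng = np.random.default_rng(0)
p_xyz = rng.random((2, 2, 2))
p_xyz /= p_xyz.sum()  # normalize so the entries sum to 1

p_z = p_xyz.sum(axis=(0, 1))             # p(z)
p_xz = p_xyz.sum(axis=1)                 # p(x, z)
p_xy_given_z = p_xyz / p_z               # p(x, y | z), broadcast over the z axis
p_x_given_z = p_xz / p_z                 # p(x | z)
p_y_given_xz = p_xyz / p_xz[:, None, :]  # p(y | x, z)

# Each conditional entropy is an expectation over the full joint pmf,
# matching H(Y|X) = E_{p_{X,Y}}[-log2 p_{Y|X}(Y|X)].
H_XY_given_Z = -np.sum(p_xyz * np.log2(p_xy_given_z))
H_X_given_Z = -np.sum(p_xyz * np.log2(p_x_given_z[:, None, :]))
H_Y_given_XZ = -np.sum(p_xyz * np.log2(p_y_given_xz))

print(H_XY_given_Z, H_X_given_Z + H_Y_given_XZ)
assert np.isclose(H_XY_given_Z, H_X_given_Z + H_Y_given_XZ)
```

The assertion holds for any joint pmf you plug in, which is consistent with the algebraic derivation above.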