Proof $I(X;X|Z)= H(X|Z)$


$I(X;X|Z)= H(X|Z)$

Proof

$I(X;X|Z) = \sum_{x,z}p(x,z) \log \frac{p(x,x|z)}{p(x|z)p(x|z)}$

Since $p(x,x|z)= p(x|z)$ (the pair $(X,X)$ is supported on the diagonal $x = x'$), this becomes

$I(X;X|Z) = \sum_{x,z} p(x,z) \log \frac{1}{p(x|z)}$

$= - \sum_{x,z} p(x,z) \log p(x|z)= H(X|Z)$

Can anyone please check whether my answer is correct? Thanks.

Accepted answer:

Looks good to me. An alternative algebraic proof:

Mutual information is a difference of entropies: $I(X;X|Z)=H(X|Z)-H(X|X,Z)$. The second term is $0$, since given $X$, $X$ is deterministic.
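As a quick numerical sanity check of the identity, here is a short sketch that computes both sides on a toy joint distribution $p(x,z)$ (the specific numbers are my own arbitrary choice, not from the question):

```python
import math

# A toy joint distribution p(x, z) over X in {0, 1}, Z in {0, 1}.
# The numbers are arbitrary; any valid joint distribution works.
p_xz = {(0, 0): 0.1, (1, 0): 0.3, (0, 1): 0.4, (1, 1): 0.2}

# Marginal p(z) and conditional p(x|z) = p(x, z) / p(z).
p_z = {z: sum(p for (x, zz), p in p_xz.items() if zz == z) for z in (0, 1)}
p_x_given_z = {(x, z): p / p_z[z] for (x, z), p in p_xz.items()}

# H(X|Z) = -sum_{x,z} p(x, z) log p(x|z)
H = -sum(p * math.log(p_x_given_z[(x, z)]) for (x, z), p in p_xz.items())

# I(X;X|Z) = sum_{x,x',z} p(x, x', z) log[ p(x, x'|z) / (p(x|z) p(x'|z)) ].
# The pair (X, X) only puts mass on the diagonal x = x', where
# p(x, x|z) = p(x|z), so only the diagonal terms contribute.
I = sum(p * math.log(p_x_given_z[(x, z)]
                     / (p_x_given_z[(x, z)] * p_x_given_z[(x, z)]))
        for (x, z), p in p_xz.items())

print(H, I)  # the two values agree (up to floating-point error)
```

Inside the sum, the ratio collapses to $1/p(x|z)$, which is exactly the step in the question's derivation.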