Mutual information and additivity under independence?


So I've been trying to figure this out since I saw it quoted in a paper. Suppose $y$ and $z$ are two independent random variables. Is it true that $I(x;y) + I(x;z) \leq I(x;y,z)$? My intuition is that since $y$ and $z$ are independent, the information gained about $x$ from taking them singly (in sequence) or together should be the same.

1 Answer


The proof is a straightforward computation: $$\begin{align}&I(x;y)+I(x;z)-I(x;y,z)\\ =&H(y)-H(y|x)+H(z)-H(z|x)-(H(y,z)-H(y,z|x))\\ =&H(y)+H(z)-H(y,z)+H(y,z|x)-H(y|x)-H(z|x)\\ =&I(y;z)-I(y;z|x)\\ \leq&0\end{align}$$ The last inequality holds because $y$ and $z$ are independent, so $I(y;z)=0$, while conditional mutual information is always nonnegative, so $I(y;z|x)\geq 0$. Intuitively, even though $y$ and $z$ are independent, conditioning on $x$ can create a dependence between them when $x$ is a function of $y$ and $z$. For example, let $y,z$ be independent fair bits and $x=y\oplus z$ (addition mod 2). Given $x=1$, we know $y=1\oplus z$, i.e., $y\neq z$, so $y$ and $z$ become perfectly correlated.
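The XOR example above can be checked numerically. A minimal sketch (names like `mutual_info` and `marginal` are my own helpers, not from the answer): enumerate the joint distribution of $(x,y,z)$ with $y,z$ independent fair bits and $x=y\oplus z$, then compute the mutual informations directly from the definition $I(A;B)=\sum p(a,b)\log_2\frac{p(a,b)}{p(a)p(b)}$.

```python
from itertools import product
from math import log2

# Joint distribution of (x, y, z): y, z independent fair bits, x = y XOR z.
joint = {}
for y, z in product([0, 1], repeat=2):
    x = y ^ z
    joint[(x, y, z)] = joint.get((x, y, z), 0.0) + 0.25

def marginal(p, idx):
    """Marginal distribution over the coordinates listed in idx."""
    m = {}
    for outcome, prob in p.items():
        key = tuple(outcome[i] for i in idx)
        m[key] = m.get(key, 0.0) + prob
    return m

def mutual_info(p, a_idx, b_idx):
    """I(A;B) in bits, where A and B are given by coordinate index tuples."""
    pa, pb = marginal(p, a_idx), marginal(p, b_idx)
    pab = marginal(p, a_idx + b_idx)
    return sum(prob * log2(prob / (pa[k[:len(a_idx)]] * pb[k[len(a_idx):]]))
               for k, prob in pab.items() if prob > 0)

I_xy  = mutual_info(joint, (0,), (1,))     # I(x;y)  = 0 bits
I_xz  = mutual_info(joint, (0,), (2,))     # I(x;z)  = 0 bits
I_xyz = mutual_info(joint, (0,), (1, 2))   # I(x;y,z) = 1 bit
print(I_xy, I_xz, I_xyz)  # → 0.0 0.0 1.0
```

Here the inequality is strict: each of $y$ and $z$ alone tells you nothing about $x$, yet together they determine it completely, so $I(x;y)+I(x;z)=0<1=I(x;y,z)$.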