Information content of two bits


Let $X \in\{0,1\}$ and $Y \in\{0,1\}$ be two uniformly distributed bits, and let $B$ be an arbitrary random variable such that $I(X:B)=0$, $I(Y:B)=0$, and $I(X \oplus Y:B)=0$. Is it then true that $I(X,Y:B)=0$?

($I(X:Y)$ is Shannon’s mutual information.)
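To see why the third assumption matters, note that it rules out choices of $B$ such as $B = X \oplus Y$ itself. A short numerical sketch (a hypothetical example, assuming additionally that $X$ and $Y$ are independent uniform bits):

```python
import math
from itertools import product

def mutual_information(joint):
    """I(A:B) in bits, given joint[(a, b)] = P(A=a, B=b)."""
    pa, pb = {}, {}
    for (a, b), pr in joint.items():
        pa[a] = pa.get(a, 0.0) + pr
        pb[b] = pb.get(b, 0.0) + pr
    return sum(pr * math.log2(pr / (pa[a] * pb[b]))
               for (a, b), pr in joint.items() if pr > 0)

# Hypothetical choice of B: X, Y independent uniform bits, B = X xor Y.
outcomes = [(x, y, x ^ y) for x, y in product((0, 1), repeat=2)]  # prob 1/4 each

def joint_of(f):
    """Joint distribution of (f(x, y), B)."""
    j = {}
    for x, y, b in outcomes:
        key = (f(x, y), b)
        j[key] = j.get(key, 0.0) + 0.25
    return j

I_XB  = mutual_information(joint_of(lambda x, y: x))       # 0: B reveals nothing about X
I_YB  = mutual_information(joint_of(lambda x, y: y))       # 0: ... nor about Y
I_SB  = mutual_information(joint_of(lambda x, y: x ^ y))   # 1: but B *is* X xor Y
I_XYB = mutual_information(joint_of(lambda x, y: (x, y)))  # 1: one full bit about the pair
print(I_XB, I_YB, I_SB, I_XYB)
```

This $B$ satisfies the first two conditions yet carries a full bit of information about the pair $(X,Y)$, so it is exactly the condition $I(X\oplus Y:B)=0$ that excludes it.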

Yes. Zero mutual information is equivalent to independence, so each assumption says that the corresponding conditional probabilities do not depend on the value of $B$. Fix any two values $b, b'$ in the support of $B$ (the question allows an arbitrary $B$; the argument below applies to every such pair). The assumptions give
\begin{align*}
P((X,Y)\in\{(0,0),(0,1)\}\mid B=b)&= P((X,Y)\in\{(0,0),(0,1)\}\mid B=b')\\
P((X,Y)\in\{(1,0),(1,1)\}\mid B=b)&= P((X,Y)\in\{(1,0),(1,1)\}\mid B=b')\\
P((X,Y)\in\{(0,0),(1,0)\}\mid B=b)&= P((X,Y)\in\{(0,0),(1,0)\}\mid B=b')\\
P((X,Y)\in\{(0,1),(1,1)\}\mid B=b)&= P((X,Y)\in\{(0,1),(1,1)\}\mid B=b')\\
P((X,Y)\in\{(0,0),(1,1)\}\mid B=b)&= P((X,Y)\in\{(0,0),(1,1)\}\mid B=b')\\
P((X,Y)\in\{(0,1),(1,0)\}\mid B=b)&= P((X,Y)\in\{(0,1),(1,0)\}\mid B=b').
\end{align*}
(The first pair of equations comes from $I(X:B)=0$, the second pair from $I(Y:B)=0$, and the third pair from $I(X\oplus Y:B)=0$.) Let
$$U_b=\bigl[P((X,Y)=(0,0)\mid B=b),\,P((X,Y)=(0,1)\mid B=b),\,P((X,Y)=(1,0)\mid B=b),\,P((X,Y)=(1,1)\mid B=b)\bigr]^T.$$
Removing the redundant equations leaves
\begin{align*}
\begin{bmatrix}
1& 1&0&0\\
0& 0&1&1\\
0& 1&0&1\\
0& 1&1&0
\end{bmatrix}\cdot U_b=
\begin{bmatrix}
1& 1&0&0\\
0& 0&1&1\\
0& 1&0&1\\
0& 1&1&0
\end{bmatrix}\cdot U_{b'}.
\end{align*}
The coefficient matrix is invertible (its determinant is $2$), so $U_b=U_{b'}$. In other words,
$$\forall\, b,b',x,y:\quad P((X,Y)=(x,y)\mid B=b)=P((X,Y)=(x,y)\mid B=b'),$$
so $(X,Y)$ is independent of $B$, i.e. $I(X,Y:B)=0$.
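The invertibility step is easy to check numerically; a minimal sketch with NumPy:

```python
import numpy as np

# Coefficient matrix from the argument above; rows encode the events
# X=0, X=1, Y=1, X xor Y=1 over the outcomes (0,0), (0,1), (1,0), (1,1).
M = np.array([
    [1, 1, 0, 0],   # X = 0
    [0, 0, 1, 1],   # X = 1
    [0, 1, 0, 1],   # Y = 1
    [0, 1, 1, 0],   # X xor Y = 1
], dtype=float)

det = np.linalg.det(M)  # nonzero, so M is invertible

# Invertibility means M @ u determines u uniquely, so M @ U_b = M @ U_b'
# forces U_b = U_b'.  Check with an arbitrary probability vector:
u = np.array([0.1, 0.4, 0.4, 0.1])
recovered = np.linalg.solve(M, M @ u)
print(det, np.allclose(recovered, u))
```

Since the matrix has full rank, the four conditional probabilities of $(X,Y)$ are pinned down by the four event probabilities that the assumptions force to agree.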