Calculation of Shannon entropy given the mutual information of binary strings


Suppose $A$ and $B$ are two binary strings of length $l$, and suppose their mutual information (https://en.wikipedia.org/wiki/Mutual_information) is known to be $I$.

Now suppose a Boolean function is applied bit-wise to obtain another bit string $C$. The Boolean function is $f(0,0)=0$, $f(0,1)=0$, $f(1,0)=0$, $f(1,1)=1$ — that is, bitwise AND.
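For concreteness, here is a small sketch (the strings `A` and `B` below are my own example values, not from the question) showing that this truth table is exactly the logical AND applied position by position:

```python
def f(a: int, b: int) -> int:
    """f(0,0) = f(0,1) = f(1,0) = 0 and f(1,1) = 1, i.e. logical AND."""
    return a & b

# Example strings of length l = 4 (chosen arbitrarily for illustration).
A = [1, 0, 1, 1]
B = [1, 1, 0, 1]
C = [f(a, b) for a, b in zip(A, B)]
print(C)  # [1, 0, 0, 1]
```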

Under this function $f$, what is the Shannon entropy of the bit string $C$? Specifically, I want to know the least upper bound and the greatest lower bound of that entropy.

Thanks in advance.

Best answer:

The mutual information tells you nothing about the entropy of $C$.

The mutual information can be $l$ and the entropy of $C$ can be $0$: set $$ A(i) = 1-B(i) = \begin{cases} 0 \mbox{ with probability } 1/2, \\ 1 \mbox{ with probability } 1/2. \end{cases}$$

The mutual information can be $l$ and the entropy of $C$ can be $l$: set $$A(i) = B(i) = \begin{cases} 0 \mbox{ with probability } 1/2, \\ 1 \mbox{ with probability } 1/2. \end{cases} $$

The mutual information can be $0$ and the entropy of $C$ can be $l$: set $$ A(i) = \begin{cases} 0 \mbox{ with probability } 1 - 1/\sqrt{2},\\ 1 \mbox{ with probability } 1/\sqrt{2}, \end{cases}$$ and use the same distribution for $B(i)$, making $A$ independent of $B$. Then $P(C(i)=1) = (1/\sqrt{2})^2 = 1/2$, so each bit of $C$ is uniform.
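A quick per-bit arithmetic check of this case (my own sketch, not part of the answer): independence gives $P(C(i)=1) = P(A(i)=1)\,P(B(i)=1) = (1/\sqrt 2)^2 = 1/2$, which maximizes the binary entropy at $1$ bit per position.

```python
import math

p = 1 / math.sqrt(2)   # P(A(i) = 1) = P(B(i) = 1)
p_c = p * p            # independence: P(C(i) = 1) = (1/sqrt(2))^2 = 1/2
# Binary entropy of a Bernoulli(p_c) bit, in bits.
h_c = -(p_c * math.log2(p_c) + (1 - p_c) * math.log2(1 - p_c))
print(round(p_c, 6), round(h_c, 6))  # 0.5 1.0
```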

Finally, the mutual information can be $0$ and the entropy of $C$ can be $0$: set $$A(i) = B(i) = 1.$$

And you can get any convex combination of these cases by using these same distributions on substrings of $A$ and $B$.
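The four extremal cases above can be verified numerically per bit position. The following is my own sanity check (helper names are mine): it computes $I(A;B)$ and $H(C)$ for one position of each construction directly from the joint distribution, with $C = \mathrm{AND}(A, B)$.

```python
import math

def H(dist):
    """Shannon entropy (bits) of a probability distribution given as {outcome: prob}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def mutual_info(joint):
    """I(A;B) in bits, from a joint distribution over (a, b) pairs."""
    pa, pb = {}, {}
    for (a, b), p in joint.items():
        pa[a] = pa.get(a, 0) + p
        pb[b] = pb.get(b, 0) + p
    return sum(p * math.log2(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

def H_C(joint):
    """Entropy of C = AND(A, B) under the given joint distribution."""
    pc = {}
    for (a, b), p in joint.items():
        c = a & b
        pc[c] = pc.get(c, 0) + p
    return H(pc)

q = 1 / math.sqrt(2)
cases = {
    "A = 1-B, uniform": {(0, 1): 0.5, (1, 0): 0.5},          # I = 1, H(C) = 0
    "A = B, uniform":   {(0, 0): 0.5, (1, 1): 0.5},          # I = 1, H(C) = 1
    "independent, P(1) = 1/sqrt(2)":                          # I = 0, H(C) = 1
        {(a, b): (q if a else 1 - q) * (q if b else 1 - q)
         for a in (0, 1) for b in (0, 1)},
    "A = B = 1":        {(1, 1): 1.0},                        # I = 0, H(C) = 0
}
for name, joint in cases.items():
    print(f"{name}: I = {mutual_info(joint):.3f}, H(C) = {H_C(joint):.3f}")
```

Summed over $l$ independent positions, these per-bit values scale to the $(I, H(C))$ pairs $(l, 0)$, $(l, l)$, $(0, l)$, and $(0, 0)$ claimed above.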

Another answer:

The greatest lower bound is 0 bits:

Take $A = 1-B$; then $I(A;B) = H(A) = H(B) > 0$ (for example, $l$ bits if each bit of $A$ is uniform). But with your definition of $f$, $C$ is the all-zero string, and hence $H(C) = 0$.

An upper bound is given in the comments by Peter Shor.