Let $A$ and $B$ be two different binary strings of length $l$, and suppose the mutual information (https://en.wikipedia.org/wiki/Mutual_information) of $A$ and $B$ is known to be $I$.
Now suppose a Boolean function is applied bitwise to obtain another bit-string $C$. The Boolean function is $f(0,0)=0$, $f(0,1)=0$, $f(1,0)=0$, $f(1,1)=1$; that is, $C(i) = f(A(i), B(i))$ is the bitwise AND of $A$ and $B$.
Under this function $f$, what would the Shannon entropy of the bit-string $C$ be? Specifically, I want the least upper bound and greatest lower bound of the Shannon entropy of $C$ in terms of $I$.
Thanks in advance.
The mutual information tells you nothing about the entropy of $C$: every combination of extremes is achievable.
The mutual information can be $l$ and the entropy of $C$ can be $0$: set $$ A(i) = 1-B(i) = \begin{cases} 0 \mbox{ with probability } 1/2, \\ 1 \mbox{ with probability } 1/2. \end{cases}$$
The mutual information can be $l$ and the entropy of $C$ can be $l$: set $$A(i) = B(i) = \begin{cases} 0 \mbox{ with probability } 1/2, \\ 1 \mbox{ with probability } 1/2. \end{cases} $$
The mutual information can be $0$ and the entropy of $C$ can be $l$: set $$ A(i) = \begin{cases} 0 \mbox{ with probability } 1 - 1/\sqrt{2},\\ 1 \mbox{ with probability } 1/\sqrt{2}, \end{cases}$$ and use the same distribution for $B(i)$, making $A$ independent of $B$.
Finally, the mutual information can be $0$ and the entropy of $C$ can be $0$: set $$A(i) = B(i) = 1.$$
And you can get any convex combination of these extremes by using these same distributions on disjoint substrings of $A$ and $B$.
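As a quick sanity check of the third case, here is a minimal simulation (my own sketch, not from the answer above): with $A(i)$ and $B(i)$ independent and each equal to $1$ with probability $1/\sqrt{2}$, we get $P(C(i)=1) = (1/\sqrt{2})^2 = 1/2$, so each bit of $C$ carries a full bit of entropy.

```python
import math
import random

random.seed(0)

p = 1 / math.sqrt(2)  # P(A(i) = 1) = P(B(i) = 1)
n = 200_000           # number of independent bit positions sampled

# Draw A and B independently, then form C by bitwise AND.
a = [random.random() < p for _ in range(n)]
b = [random.random() < p for _ in range(n)]
c = [x and y for x, y in zip(a, b)]

q = sum(c) / n  # empirical P(C(i) = 1); should be near 1/2

def binary_entropy(q):
    """Shannon entropy (in bits) of a Bernoulli(q) variable."""
    if q in (0.0, 1.0):
        return 0.0
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

print(q, binary_entropy(q))  # both should be close to 0.5 and 1.0
```

Since the positions are i.i.d., the entropy of the whole string $C$ is $l$ times the per-bit entropy, which approaches $l$ as claimed.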