Mutual information between two Bernoulli variables


Can we say anything about the mutual information between two Bernoulli variables, $X \sim Bern(p_1)$ and $Y \sim Bern(p_2)$?

And what if $p_1 =1$?

I got as far as $$I(X;Y) = H(X) - H(X|Y) = -H(X|Y).$$

Can we do anything more? What if we assume that $Y$ depends on $X$?


The last step of your formula for mutual information has a sign error: in general $I(X;Y) = H(X) - H(X|Y)$, and this equals $-H(X|Y)$ only when $H(X) = 0$.

If $X \sim \text{Bern}(p_1)$ with $p_1 = 1$, then $X$ is constant, so $H(X) = 0$. Since conditional entropy satisfies $H(X|Y) \geq 0$, it follows that $I(X;Y) = H(X) - H(X|Y) \leq H(X) = 0$.

Edit:

Also, $I(X;Y) \geq 0$ (mutual information can never be negative), which, combined with the bound above, implies $I(X;Y) = 0$.
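As a numerical sanity check, here is a small sketch in Python (the `mutual_information` helper and the example joint distributions are my own, not from the question) that computes $I(X;Y)$ directly from a $2\times 2$ joint pmf. It shows a fully dependent pair giving 1 bit, and the degenerate case $p_1 = 1$ giving exactly zero:

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits from a 2x2 joint pmf, joint[i][j] = P(X=i, Y=j)."""
    px = [sum(row) for row in joint]          # marginal of X
    py = [sum(col) for col in zip(*joint)]    # marginal of Y
    mi = 0.0
    for i in range(2):
        for j in range(2):
            p = joint[i][j]
            if p > 0:  # 0 * log(0/...) contributes 0 by convention
                mi += p * math.log2(p / (px[i] * py[j]))
    return mi

# Fully dependent case: Y = X with X ~ Bern(0.5) -> I(X;Y) = H(X) = 1 bit
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))  # 1.0

# Degenerate case p_1 = 1: X is always 1, Y ~ Bern(0.7) -> I(X;Y) = 0
print(mutual_information([[0.0, 0.0], [0.3, 0.7]]))  # 0.0
```

The second example makes the answer concrete: when $X$ is constant, every joint term has $p = p_X(i)\,p_Y(j)$, so each logarithm vanishes and the sum is zero.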