Question: I have two random variables $X\sim\text{Bernoulli}(\alpha)$ and $Y\sim\mathcal{N}(0,1)$. Is it possible to compute the mutual information $I(X;X+Y)$ analytically?
My attempt: By the definition of mutual information, we have $$ I(X;X+Y)=H(X)-H(X|X+Y), $$ where $H(X)$ can be worked out analytically but I don't know how to work out $H(X|X+Y)$.
Let $Z=X+Y$ and assume $X$ and $Y$ are independent. Then $Z$ has a density $f_Z$ corresponding to a Gaussian mixture, $f_Z(z)=(1-\alpha)f_Y(z)+\alpha f_Y(z-1)$. By Bayes' rule, $$P(X=0\mid Z=z)= \frac{f_{Z\mid X}(z\mid 0)\, P(X=0)}{f_Z(z)}=\frac{f_Y(z)(1-\alpha)}{f_Z(z)}=:g(z).$$
So $H(X|Z=z)= h_b(g(z))$ where $h_b$ is the binary entropy function. And
$$H(X|Z)= \int f_Z(z) h_b(g(z)) dz$$
It might be easier to compute $I(X;Z)=h(Z)-h(Z\mid X)$ instead, where $h$ denotes differential entropy. Given $X=x$, we have $Z=x+Y$, so $h(Z\mid X)=h(Y)=\tfrac12\log(2\pi e)$, and only $h(Z)$ (the entropy of the Gaussian mixture, which generally has no closed form) needs to be evaluated numerically.
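The alternative route can be sketched the same way: compute $h(Z)$ by quadrature and subtract $h(Y)=\tfrac12\log(2\pi e)$. Again assuming independence and an example value $\alpha=0.5$:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

alpha = 0.5  # example value of the Bernoulli parameter (assumption)

def f_Z(z):
    # Gaussian mixture density of Z = X + Y
    return (1 - alpha) * norm.pdf(z) + alpha * norm.pdf(z - 1)

def integrand(z):
    # -f_Z log f_Z, guarding against underflow of the density in the tails
    fz = f_Z(z)
    return -fz * np.log(fz) if fz > 0 else 0.0

# differential entropy h(Z) of the mixture, in nats
h_Z, _ = quad(integrand, -np.inf, np.inf)

# h(Z|X) = h(Y) = (1/2) log(2*pi*e) for a standard Gaussian
h_Z_given_X = 0.5 * np.log(2 * np.pi * np.e)

I = h_Z - h_Z_given_X  # I(X;Z) = h(Z) - h(Z|X)
print(I)
```

Both routes compute the same mutual information, so they should agree up to quadrature error; this gives a built-in consistency check.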