Suppose $X\sim\text{Bernoulli}(\nu)$ for some $\nu\in(0,1)$ and $Y\sim N(0,1)$ are independent random variables. Can the mutual information $I(X;CX+Y)$, where $C$ is a non-random constant, be simplified further? I tried breaking things up into entropies, but things just got more complicated. I am mainly wondering how the value of $C$ affects the mutual information.
My attempt: \begin{align*} I(X;CX+Y) &= h(CX+Y)-h(CX+Y\mid X) \\ &= h(CX+Y)-h(Y) \\ &= h(CX+Y)-\frac{1}{2}\log(2\pi e), \end{align*} where $h$ denotes differential entropy. At this stage I am stuck on how to evaluate $h(CX+Y)$...
The density of $Z=CX+Y$ is a mixture of the normals $N(0,1)$ and $N(C,1)$, with weights $1-\nu$ and $\nu$, respectively. Its differential entropy has no closed form. Of course, if $C=0$ (or $\nu\in\{0,1\}$), then $h(Z)=h(Y)$. And if $|C| \gg 1$ (the standard deviation of the noise $Y$), the two components barely overlap, and we can write $h(Z) \approx H_b(\nu) + h(Y)$, where $H_b$ is the binary entropy function. Some bounds or approximations are feasible, but in general you will need to compute the result numerically.
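As a sanity check on the large-$|C|$ approximation, here is a short numerical sketch (assuming SciPy is available; the function name `h_Z` is mine) that integrates $-p_Z \log p_Z$ for the mixture density:

```python
import numpy as np
from scipy.integrate import quad

def h_Z(C, nu):
    """Differential entropy (nats) of Z = C*X + Y, a two-component Gaussian mixture."""
    phi = lambda t: np.exp(-t * t / 2) / np.sqrt(2 * np.pi)  # standard normal pdf
    pdf = lambda z: (1 - nu) * phi(z) + nu * phi(z - C)      # mixture density of Z
    brk = sorted(set((0.0, float(C))))                       # component means, as break points
    val, _ = quad(lambda z: -pdf(z) * np.log(pdf(z)),
                  min(brk) - 12, max(brk) + 12, points=brk)
    return val

nu, C = 0.3, 15.0
H_b = -(nu * np.log(nu) + (1 - nu) * np.log(1 - nu))  # binary entropy (nats)
h_Y = 0.5 * np.log(2 * np.pi * np.e)                  # h(N(0,1))
print(h_Z(C, nu), H_b + h_Y)  # nearly equal once |C| >> 1
```

Everything here is in nats (natural log); the `points` argument just tells the quadrature routine where the two mixture modes sit.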
Equivalently, you can write $I(X;Z)=H(X)-H(X\mid Z)$, so, rather trivially, $0 \le I(X;Z) \le H_b(\nu)$. The true value is close to the upper bound when $|C| \gg 1$, and close to zero when $|C| \ll 1$. Perhaps the bound in this paper is relevant.
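To see both limits, one can compute $I(X;Z)=h(Z)-h(Y)$ numerically and check it against the bounds (again a sketch, with SciPy assumed and `mutual_info` my own name):

```python
import numpy as np
from scipy.integrate import quad

def mutual_info(C, nu):
    """I(X; Z) in nats for Z = C*X + Y, computed as h(Z) - h(Y)."""
    phi = lambda t: np.exp(-t * t / 2) / np.sqrt(2 * np.pi)  # standard normal pdf
    pdf = lambda z: (1 - nu) * phi(z) + nu * phi(z - C)      # mixture density of Z
    brk = sorted(set((0.0, float(C))))
    hz, _ = quad(lambda z: -pdf(z) * np.log(pdf(z)),
                 min(brk) - 12, max(brk) + 12, points=brk)
    return hz - 0.5 * np.log(2 * np.pi * np.e)               # subtract h(Y)

nu = 0.3
H_b = -(nu * np.log(nu) + (1 - nu) * np.log(1 - nu))
vals = [mutual_info(C, nu) for C in (0.1, 1.0, 3.0, 10.0)]
# I grows monotonically from ~0 (|C| << 1) toward H_b (|C| >> 1)
```

The computed values stay inside $[0, H_b(\nu)]$ and sweep from one endpoint to the other as $|C|$ grows, consistent with the bounds above.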