Let $x_1$ and $x_2$ be two independent random variables with the same probability distribution $p(x)=p(x_1)=p(x_2)$, and let $z$ be a normal random variable which is independent of $x_1$ and $x_2$. Consider the following problem: $$\tag{1} \label{1} \max_{p(x):\ \ \mathbb{E}[x^2]\le 1} I(x_1;x_1+x_2+z), $$ where $I(\cdot;\cdot)$ denotes the mutual information between two random variables.
My question. Is it true that the optimal probability distribution $p(x)$ in \eqref{1} is Gaussian?
If we consider $I(x_1;x_1+z)$ instead of $I(x_1;x_1+x_2+z)$, then it is easy to show (using a maximum entropy argument) that the optimal distribution must be Gaussian. However, it is not clear whether a similar argument applies in my case. Any help is much appreciated. Thank you.
I don't have a very illuminating answer or anything, just numerical evidence that the Gaussian is not the maximiser above.
I'll work with natural logs for convenience.
Note that, since $X_1$ is independent of $(X_2, Z)$, we have $h(X_1 + X_2 + Z \mid X_1) = h(X_2 + Z)$, and so $I(X_1; X_1 + X_2 + Z) = h(X_1 + X_2 + Z) - h(X_2 + Z).$
If $X$ is Gaussian of variance $\sigma^2$, then we can explicitly compute the functional above to be $1/2 \log (2 - 1/(1+\sigma^2))$. This is increasing in $\sigma$, so under the constraint the best any Gaussian can do is $1/2 \log (3/2) \approx 0.20273.$
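In case it helps, here is that computation spelled out, using $h(\mathcal N(0,s^2)) = \frac12\log(2\pi e s^2)$ with $X_1, X_2 \sim \mathcal N(0,\sigma^2)$ independent and $Z \sim \mathcal N(0,1)$:
$$I(X_1; X_1+X_2+Z) = \tfrac12\log\big(2\pi e(2\sigma^2+1)\big) - \tfrac12\log\big(2\pi e(\sigma^2+1)\big) = \tfrac12\log\frac{2\sigma^2+1}{\sigma^2+1} = \tfrac12\log\Big(2 - \frac{1}{1+\sigma^2}\Big).$$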
On the other hand, consider $X$ uniform on $\{+1, -1\}$.
Let $\varphi$ be the density of the standard Gaussian. Note then that the density of $X + Z$ is $(\varphi(u-1) + \varphi(u+1))/2$ and the density of $X_1 + X_2 + Z$ is $(\varphi(u-2) + \varphi(u+2) + 2\varphi(u))/4.$
At this point, I simply entered these expressions into Wolfram Alpha and evaluated the two differential entropies numerically.
Thus, for this $X$, the functional is $\ge 0.204.$
(In case you want to check the above: W|A helpfully displays the LaTeX-ed versions of the expressions keyed in.)
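If you'd rather not rely on W|A, here is a short Python sketch that redoes the comparison (my own re-derivation, not the original computation: it evaluates $-\int p \log p$ for the two densities above by brute-force quadrature on a fine grid):

```python
import numpy as np

# Natural logs throughout, matching the answer above.
u = np.linspace(-12.0, 12.0, 400001)   # fine grid; tails beyond |u| = 12 are negligible
du = u[1] - u[0]

def phi(t):
    """Density of the standard Gaussian."""
    return np.exp(-t**2 / 2) / np.sqrt(2 * np.pi)

# Density of X_2 + Z for X uniform on {+1, -1}:
p2 = (phi(u - 1) + phi(u + 1)) / 2
# Density of X_1 + X_2 + Z:
p3 = (phi(u - 2) + phi(u + 2) + 2 * phi(u)) / 4

def diff_entropy(p):
    """Differential entropy -∫ p log p via the rectangle rule."""
    mask = p > 0                         # skip points where p underflows to zero
    return -np.sum(p[mask] * np.log(p[mask])) * du

I_binary = diff_entropy(p3) - diff_entropy(p2)   # the functional for binary X
I_gauss_best = 0.5 * np.log(1.5)                 # best Gaussian under E[X^2] <= 1

print(f"binary input : {I_binary:.5f} nats")
print(f"Gaussian best: {I_gauss_best:.5f} nats")
```

The binary input comes out a couple of thousandths of a nat above the Gaussian bound, consistent with the $\ge 0.204$ figure above.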