Max mutual information with variance constraint


Let $x_1$ and $x_2$ be two independent random variables with the same probability distribution $p(x)=p(x_1)=p(x_2)$, and let $z$ be a normal random variable which is independent of $x_1$ and $x_2$. Consider the following problem: $$\tag{1} \label{1} \max_{p(x):\ \ \mathbb{E}[x^2]\le 1} I(x_1;x_1+x_2+z), $$ where $I(\cdot;\cdot)$ denotes the mutual information between two random variables.

My question. Is it true that the optimal probability distribution $p(x)$ in \eqref{1} is Gaussian?

If we consider $I(x_1;x_1+z)$ instead of $I(x_1;x_1+x_2+z)$, then it is easy to show (using a maximum entropy argument) that the optimal distribution must be Gaussian. However, it is not clear whether a similar argument applies to my case. Any help is appreciated. Thank you.


I don't have a very illuminating answer or anything, just numerical evidence that the Gaussian is not the maximiser above.


I'll work with natural logs for convenience.

Note that, by independence, $I(X_1; X_1 + X_2 + Z) = h(X_1 + X_2 + Z) - h(X_2 + Z).$

If $X$ is Gaussian of variance $\sigma^2$, then we can explicitly compute the functional above to be $\frac{1}{2} \log \left(2 - \frac{1}{1+\sigma^2}\right).$ This is increasing in $\sigma$, so under the constraint $\mathbb{E}[X^2] \le 1$, the best any Gaussian can do is $\frac{1}{2} \log (3/2) \approx 0.20273.$
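As a quick sanity check (a small Python sketch of my own, not part of the original argument), the Gaussian value follows from $h(X_1+X_2+Z) - h(X_2+Z) = \frac12\log\frac{2\sigma^2+1}{\sigma^2+1}$:

```python
import math

def gaussian_mi(sigma2):
    # For X Gaussian with variance sigma2 and Z standard normal:
    #   I(X1; X1+X2+Z) = h(X1+X2+Z) - h(X2+Z)
    #                  = 1/2 log(2*pi*e*(2*sigma2+1)) - 1/2 log(2*pi*e*(sigma2+1))
    #                  = 1/2 log(2 - 1/(1+sigma2))
    return 0.5 * math.log(2 - 1 / (1 + sigma2))

print(gaussian_mi(1.0))  # 1/2 log(3/2) ≈ 0.20273
```

The value is increasing in `sigma2`, so the variance constraint binds at `sigma2 = 1`.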

On the other hand, consider $X$ uniform on $\{+1, -1\}$.

Let $\varphi$ be the density of the standard Gaussian. Note then that the density of $X + Z$ is $(\varphi(u-1) + \varphi(u+1))/2$ and the density of $X_1 + X_2 + Z$ is $(\varphi(u-2) + \varphi(u+2) + 2\varphi(u))/4.$
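These two mixture densities are easy to write down and check numerically (a sketch; the function names are my own, and the integration grid is an arbitrary choice):

```python
import math

def phi(u):
    # standard normal density
    return math.exp(-u * u / 2) / math.sqrt(2 * math.pi)

def p_x_plus_z(u):
    # density of X + Z, with X uniform on {+1, -1}
    return (phi(u - 1) + phi(u + 1)) / 2

def p_x1_x2_z(u):
    # density of X1 + X2 + Z; X1 + X2 takes values -2, 0, +2
    # with probabilities 1/4, 1/2, 1/4
    return (phi(u - 2) + phi(u + 2) + 2 * phi(u)) / 4

# both should integrate to 1 (midpoint rule on [-12, 12])
n, du = 100000, 24 / 100000
mass2 = sum(p_x_plus_z(-12 + (i + 0.5) * du) * du for i in range(n))
mass3 = sum(p_x1_x2_z(-12 + (i + 0.5) * du) * du for i in range(n))
print(mass2, mass3)  # both ≈ 1.0
```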

At this point, I simply enter these expressions into Wolfram Alpha. By these computations,

  1. $h(X_1 + X_2 +Z) \ge 1.960$ - see https://bit.ly/2RWQuyl
  2. $h(X_2 + Z) \le 1.756$ - see https://bit.ly/2FWPo0A

Thus, for this $X$, the functional is $\ge 0.204.$
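The Wolfram Alpha bounds can also be reproduced with a straightforward numerical integration (my own sketch; the grid, cutoff, and tolerances are arbitrary choices):

```python
import math

def phi(u):
    # standard normal density
    return math.exp(-u * u / 2) / math.sqrt(2 * math.pi)

def entropy(density, lo=-12.0, hi=12.0, n=200000):
    # differential entropy -∫ p log p, midpoint rule on [lo, hi]
    du = (hi - lo) / n
    total = 0.0
    for i in range(n):
        p = density(lo + (i + 0.5) * du)
        if p > 0.0:
            total -= p * math.log(p) * du
    return total

h3 = entropy(lambda u: (phi(u - 2) + phi(u + 2) + 2 * phi(u)) / 4)
h2 = entropy(lambda u: (phi(u - 1) + phi(u + 1)) / 2)
print(h3, h2, h3 - h2)  # roughly 1.960, 1.756, 0.204
print(h3 - h2 > 0.5 * math.log(1.5))  # exceeds the best Gaussian value
```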

(In case you want to check the above: W|A helpfully displays the LaTeX'd versions of the expressions keyed in.)