Lower bound on the $\Phi$-entropy of a Gaussian variable


I am trying to prove that for $X$ a standard Gaussian random variable,

$$\limsup_{n\to\infty}\,\mathbb{E}\left[(X+n)^2\log\left(\frac{(X+n)^2}{1+n^2}\right)\right]=2.$$

I already know from the Gaussian logarithmic Sobolev inequality (Concentration Inequalities, Boucheron, Lugosi, and Massart, 2013, p. 124) that the left-hand side is at most $2$, so I am trying to bound it from below.

The left-hand side is the $\Phi$-entropy of $(X+n)^2$ for $\Phi(x)=x\log x$. I have checked the result numerically, but had no luck proving it. I tried rewriting it as an integral against the density of a Gaussian variable centered at $n$:

$$\frac1{\sqrt{2\pi}} \int_{-\infty}^\infty x^2\log\left(\frac{x^2}{1+n^2}\right)\exp\left(\frac{-(x-n)^2}2\right)\,\mathrm dx.$$

I've also tried an integration by parts using an antiderivative of $x^2 \log(x^2)$, but the resulting expression was no simpler; I was hoping a factor of $2$ would emerge, but it did not.
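For reference, here is how the numerical check can be done; this is only a sketch (the helper `phi_entropy`, the quadrature grid, and the cutoff $\pm 15$ are my own choices, not part of the question). It approximates $\mathbb{E}\left[(X+n)^2\log\left((X+n)^2/(1+n^2)\right)\right]$ for $X \sim \mathcal{N}(0,1)$ by trapezoidal quadrature:

```python
import math

def phi_entropy(n, lo=-15.0, hi=15.0, steps=100001):
    """Approximate E[(X+n)^2 log((X+n)^2/(1+n^2))] for X ~ N(0,1)
    by trapezoidal quadrature over x = X on [lo, hi]."""
    h = (hi - lo) / (steps - 1)
    total = 0.0
    for i in range(steps):
        x = lo + i * h
        y = x + n                      # value of X + n
        if y == 0.0:
            f = 0.0                    # y^2 log(y^2) -> 0 as y -> 0
        else:
            f = y * y * math.log(y * y / (1.0 + n * n))
        w = math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)
        c = 0.5 if i in (0, steps - 1) else 1.0   # trapezoid endpoint weights
        total += c * f * w
    return total * h

# The values stay below 2 and approach 2 as n grows.
for n in (0, 1, 5, 20, 100):
    print(n, round(phi_entropy(n), 4))
```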


Best answer:

Take a look at Proposition 5.5.1 of the monograph Analysis and Geometry of Markov Diffusion Operators (Bakry, Gentil, and Ledoux) together with its follow-up discussions. In particular, the choice of $f(x) = \mathrm{e}^{ax}$ (on the real line) will give you a sharp LSI (for centered Gaussian measures).
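One can check numerically that this choice of $f$ saturates the inequality $\mathrm{Ent}(f^2) \leq 2\,\mathbb{E}(f'^2)$ for the standard Gaussian measure; both sides equal $2a^2 e^{2a^2}$. A sketch (the helper `gauss_expect` and the value $a = 0.3$ are my own choices):

```python
import math

def gauss_expect(g, lo=-12.0, hi=12.0, steps=100001):
    """E[g(X)] for X ~ N(0,1), by trapezoidal quadrature."""
    h = (hi - lo) / (steps - 1)
    s = 0.0
    for i in range(steps):
        x = lo + i * h
        w = math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)
        c = 0.5 if i in (0, steps - 1) else 1.0
        s += c * g(x) * w
    return s * h

a = 0.3                                   # arbitrary test value of the parameter
f2 = lambda x: math.exp(2.0 * a * x)      # f(x)^2 for f(x) = e^{a x}
m = gauss_expect(f2)                      # E[f^2] = e^{2 a^2}
ent = gauss_expect(lambda x: f2(x) * math.log(f2(x))) - m * math.log(m)
rhs = 2.0 * gauss_expect(lambda x: (a * math.exp(a * x)) ** 2)  # 2 E[f'^2]
print(ent, rhs)
```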

Another answer:

I'm going to use Fei Cao's suggestion to answer the original question. Take $f : x \mapsto e^{\lambda x}$ for some $\lambda > 0$, and write $\Psi_X(t) = \log \mathbb{E}(e^{tX})$ for the log-moment generating function of $X$.

We have $$ \mathrm{Ent}(f^2) = \mathbb{E}\left( e^{2\lambda X}\, 2 \lambda X \right) - \mathbb{E}\left( e^{2\lambda X} \right) \log \mathbb{E}\left( e^{2\lambda X} \right) = 2\lambda\, e^{\Psi_X(2\lambda)} \Psi_X'(2\lambda) - e^{\Psi_X(2\lambda)} \Psi_X(2\lambda). $$

On the other hand, $2\mathbb{E}(f'^2) = 2\lambda^2 e^{\Psi_X(2\lambda)}$. Dividing both sides by $e^{\Psi_X(2\lambda)}$, we are left to compare $2\lambda \Psi_X'(2\lambda) - \Psi_X(2\lambda)$ with $2\lambda^2$.

Since $X$ is standard Gaussian, it is sub-Gaussian with variance factor $1$, so $\Psi_X(2\lambda) \leq 2\lambda^2$. It is therefore enough to show that $\Psi_X'(2\lambda) \sim_{\lambda \to 0} 2\lambda$.

$$ \Psi_X'(2\lambda) = \frac{M_X'(2\lambda)}{M_X(2\lambda)}, \qquad M_X'(t) = \sum_{k \geq 1} \frac{t^{k-1}}{(k-1)!}\, \mathbb{E}(X^k) \sim_{t \to 0} t\, \mathbb{E}(X^2) = t, $$

where $M_X(t) = \mathbb{E}(e^{tX}) \to 1$ as $t \to 0$ and $\mathbb{E}(X) = 0$, so indeed $\Psi_X'(2\lambda) \sim_{\lambda \to 0} 2\lambda$.

Therefore $2\lambda \Psi_X'(2\lambda) - \Psi_X(2\lambda) \geq 4\lambda^2(1+o(1)) - 2\lambda^2 = 2\lambda^2 (1+o(1))$, so the ratio $\mathrm{Ent}(f^2)/\big(2\mathbb{E}(f'^2)\big)$ tends to $1$ as $\lambda \to 0$: the constant $2$ is tight.
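In fact, for $X$ standard Gaussian no limit is needed, since $\Psi_X(t) = t^2/2$ exactly and hence $\Psi_X'(t) = t$:

```latex
% For X ~ N(0,1): Psi_X(t) = log E[e^{tX}] = t^2/2, so for every lambda > 0
\[
  2\lambda \Psi_X'(2\lambda) - \Psi_X(2\lambda)
  = 2\lambda \cdot 2\lambda - 2\lambda^2
  = 2\lambda^2 ,
\]
% i.e. Ent(f^2) = 2 E[f'^2]: the exponentials saturate the LSI exactly.
```

so exponential functions achieve equality in the Gaussian LSI for every $\lambda$, not only asymptotically.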