I am trying to prove that for $X$ a centered Gaussian variable,
$$\limsup_{n\in\mathbb{N}}\,\mathbb{E}\left[(X+n)^2\log\left(\frac{(X+n)^2}{1+n^2}\right)\right]=2.$$
I already know by the Gaussian logarithmic Sobolev inequality (*Concentration Inequalities*, Boucheron et al., 2014, p. 124) that the quantity on the left is at most $2$, so I am trying to lower-bound it.
The expectation on the left is the $\Phi$-entropy of $(X+n)^2$ for $\Phi(x)=x\log(x)$. I've found this result to hold experimentally, but have had no luck proving it. I tried rewriting it as an integral against the density of a Gaussian centered at $n$:
$$\frac1{\sqrt{2\pi}} \int_{-\infty}^\infty x^2\log\left(\frac{x^2}{1+n^2}\right)\exp\left(\frac{-(x-n)^2}2\right)\,\mathrm dx.$$
I've also tried an integration by parts using a primitive of $x^2 \log(x^2)$, but the resulting expression was no simpler. I was hoping a factor of $2$ would come out, but it didn't.
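For what it's worth, the experimental observation can be reproduced with a short numerical integration (a sketch assuming $X$ is *standard* Gaussian; the function name `phi_entropy_term` is mine):

```python
import math
from scipy.integrate import quad

def phi_entropy_term(n):
    """Numerically evaluate E[(X+n)^2 log((X+n)^2 / (1+n^2))] for X ~ N(0,1)."""
    def integrand(x):
        y2 = (x + n) ** 2
        if y2 == 0.0:
            return 0.0  # x^2 log(x^2) -> 0 at the origin, so the integrand extends by 0
        return y2 * math.log(y2 / (1 + n * n)) * math.exp(-x * x / 2) / math.sqrt(2 * math.pi)
    val, _ = quad(integrand, -math.inf, math.inf, limit=200)
    return val

for n in [1, 10, 100, 1000]:
    print(n, phi_entropy_term(n))
```

The printed values stay below $2$ (consistent with the LSI upper bound) and approach $2$ as $n$ grows.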
Take a look at Proposition 5.5.1 of the monograph *Analysis and Geometry of Markov Diffusion Operators* (Bakry, Gentil, Ledoux), together with the discussion that follows it. In particular, the choice $f(x) = \mathrm{e}^{ax}$ (on the real line) gives you a sharp LSI for centered Gaussian measures.
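To spell out the computation behind this hint (a sketch, for $X$ standard Gaussian and $f(x)=\mathrm{e}^{ax}$): using the Gaussian moment generating function $\mathbb{E}[\mathrm{e}^{tX}]=\mathrm{e}^{t^2/2}$ and its derivative $\mathbb{E}[X\mathrm{e}^{tX}]=t\,\mathrm{e}^{t^2/2}$,
$$\mathbb{E}\left[f(X)^2\right]=\mathbb{E}\left[\mathrm{e}^{2aX}\right]=\mathrm{e}^{2a^2},\qquad \mathbb{E}\left[f(X)^2\log f(X)^2\right]=\mathbb{E}\left[2aX\,\mathrm{e}^{2aX}\right]=4a^2\mathrm{e}^{2a^2},$$
so that
$$\operatorname{Ent}\left(f^2\right)=4a^2\mathrm{e}^{2a^2}-\mathrm{e}^{2a^2}\log\mathrm{e}^{2a^2}=2a^2\mathrm{e}^{2a^2}=2\,\mathbb{E}\left[f'(X)^2\right].$$
Exponentials therefore achieve equality in the Gaussian LSI $\operatorname{Ent}(f^2)\le 2\,\mathbb{E}[f'^2]$, which is why the constant $2$ cannot be improved.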