Given $\sigma>0$, let $X\sim N(0,\sigma^2I_d)$ be a normal random variable in $\mathbb R^d$. Prove or disprove: there exists a constant $C$ such that for any 1-Lipschitz function $f:\mathbb R^d\rightarrow\mathbb R$, $$ \log\big(\mathbb E[e^{f(X)}]\big) - \mathbb E[f(X)] \leqslant C\sigma^2. $$ Here, saying that $f$ is 1-Lipschitz means that $|f(x)-f(y)|\leqslant |x-y|$ for all $x,y\in\mathbb R^d$ (which in particular implies that $f$ is continuous on $\mathbb R^d$).
This problem comes from the proof of Proposition 3 in this paper, where the authors claimed that the log-Sobolev inequality can be used to prove eq. 25. The problem above can be viewed as a simplified version of eq. 25 of that paper. However, I found it non-trivial to obtain eq. 25 using the log-Sobolev inequality, the log-Harnack inequality, or any other functional inequality I know. It looks more like an inverse of Jensen's inequality, which is why the problem is so named.
Here are some direct observations on this problem. First of all, by Jensen's inequality, one has $$ \log\big(\mathbb E[e^{f(X)}]\big) - \mathbb E[f(X)] \geqslant 0, $$ which goes in the wrong direction and is therefore of no help here.
Another idea is to consider the random variable $Y = f(X)$ rather than $X$ itself. It may then seem reasonable to aim for $$ \log\big( \mathbb E[e^Y]\big) - \mathbb E[Y] \leqslant \mathrm{Var}(Y). $$ Unfortunately, this inequality does not hold for arbitrary $Y$. Moreover, the Lipschitz property of $f$ is only implicit in this formulation.
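To see that the variance bound can indeed fail, here is a concrete counterexample (my own construction, for illustration): take $Y = 4$ with probability $1/16$ and $Y = 0$ otherwise; then $\log\mathbb E[e^Y]-\mathbb E[Y]\approx 1.22$ while $\mathrm{Var}(Y)=0.9375$. A quick numerical check:

```python
import math

# Counterexample (my own, for illustration) to log E[e^Y] - E[Y] <= Var(Y)
# for general Y: Y = 4 with probability 1/16, and Y = 0 otherwise.
p, a = 1 / 16, 4.0
lhs = math.log(p * math.exp(a) + (1 - p)) - p * a   # log E[e^Y] - E[Y]
var = p * a**2 - (p * a) ** 2                        # E[Y^2] - (E[Y])^2
print(lhs, var)  # lhs ≈ 1.22 exceeds var = 0.9375
assert lhs > var
```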
In the case $d=1$, choosing $f(x) = x$ directly, one can verify that $$ \log\big(\mathbb E[e^{X}]\big) - \mathbb E[X] = \frac{\sigma^2}2. $$ This is why the RHS scales as $\sigma^2$.
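This $d=1$ computation is just the Gaussian MGF, $\mathbb E[e^X]=e^{\sigma^2/2}$, and it can be sanity-checked numerically; a minimal Monte Carlo sketch (the choice $\sigma=1.5$ and the sample size are mine, purely for illustration):

```python
import math, random

# Monte Carlo check of the d = 1 computation above (illustration only):
# for X ~ N(0, sigma^2), the Gaussian MGF gives E[e^X] = exp(sigma^2/2),
# so log(E[e^X]) - E[X] = sigma^2 / 2 when f(x) = x.
random.seed(0)
sigma = 1.5
xs = [random.gauss(0.0, sigma) for _ in range(200_000)]
mc_gap = math.log(sum(math.exp(x) for x in xs) / len(xs)) - sum(xs) / len(xs)
exact_gap = sigma**2 / 2  # 1.125
print(mc_gap, exact_gap)  # the estimate should be close to 1.125
```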
Any suggestions are appreciated. Even the solution in the case $d=1$ is fine.
Partial answer: I show that the inequality holds for all $d\ge 1$ as long as $\sigma \ge 3$. Moreover, the argument goes through a bound that is independent of the dimension $d$, which may itself be of interest.
The inequality amounts to an upper bound on $\mathbb{E}[e^{f(X)}]$, which we obtain as follows: $$\mathbb{E}[e^{f(X)}] = \int_{0}^{\infty} \mathbb{P}(e^{f(X)} \ge t) \, dt = \int_{0}^{\exp(\mathbb{E}f(X))} \mathbb{P}(e^{f(X)} \ge t) \, dt + \int_{\exp(\mathbb{E}f(X))}^{\infty} \mathbb{P}(e^{f(X)} \ge t) \, dt $$ $$\le e^{\mathbb{E}f(X)} + \int_{\exp(\mathbb{E}f(X))}^{\infty} \mathbb{P}(e^{f(X)} \ge t) \, dt. $$ Making the change of variables $t = \exp(\mathbb{E}f(X) + s)$, the latter integral is $$\int_{\exp(\mathbb{E}f(X))}^{\infty} \mathbb{P}(e^{f(X)} \ge t) \, dt = e^{\mathbb{E}f(X)} \int_{0}^{\infty} \mathbb{P}(f(X) - \mathbb{E}f(X) \ge s) e^{s} \, ds.$$ Using now the Gaussian concentration inequality for Lipschitz functions (Theorem 5.6 in Boucheron, Lugosi, and Massart, *Concentration Inequalities: A Nonasymptotic Theory of Independence*, applied to $z\mapsto f(\sigma z)$, which is $\sigma$-Lipschitz, with $Z\sim N(0,I_d)$), we have $\mathbb{P}(f(X) - \mathbb{E}f(X) \ge s) \le 2\exp(-s^2/(2\sigma^2))$ and therefore $$\int_{0}^{\infty} \mathbb{P}(f(X) - \mathbb{E}f(X) \ge s) e^{s} \, ds \le 2\int_{0}^{\infty} \exp(s - s^2/(2\sigma^2)) \, ds $$ $$= \sqrt{2\pi\sigma^2}\, e^{\sigma^2/2} \left(\operatorname{erf}\left(\sigma/\sqrt{2}\right) + 1 \right) \le C \sigma \exp(\sigma^2 / 2)$$ for the absolute constant $C = 2\sqrt{2\pi}$, using $\operatorname{erf} \le 1$. Putting it together, we have $$\mathbb{E}[e^{f(X)}] \le \left(1 + C\sigma \exp(\sigma^2 / 2) \right) e^{\mathbb{E}f(X)},$$ independently of the dimension $d$. Finally, $1 + C\sigma \exp(\sigma^2 / 2) \le \exp(\sigma^2)$ when $\sigma \ge 3$, so taking logarithms gives $\log\big(\mathbb{E}[e^{f(X)}]\big) - \mathbb{E}f(X) \le \sigma^2$ in this regime, i.e. the desired inequality with constant $1$, for every $d \ge 1$.
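The two closed-form claims in the last steps (the value of the Gaussian-type integral, and the final comparison at $\sigma \ge 3$) can be sanity-checked numerically; a short sketch (the test value $\sigma = 2$ and the integration grid are my own choices, for illustration only):

```python
import math

# Numerical sanity checks of the two closed-form claims above
# (illustration only, not part of the proof).

# (1) 2 * integral_0^inf exp(s - s^2/(2 sigma^2)) ds
#       = sqrt(2 pi) * sigma * exp(sigma^2/2) * (1 + erf(sigma/sqrt(2))),
#     checked at sigma = 2 with a midpoint Riemann sum.
sigma = 2.0
ds, upper = 1e-3, 50.0  # integrand is negligible beyond s = 50 for sigma = 2
riemann = 2 * sum(math.exp(s - s**2 / (2 * sigma**2)) * ds
                  for s in ((i + 0.5) * ds for i in range(int(upper / ds))))
closed = (math.sqrt(2 * math.pi) * sigma * math.exp(sigma**2 / 2)
          * (1 + math.erf(sigma / math.sqrt(2))))
assert abs(riemann - closed) / closed < 1e-4

# (2) 1 + C * sigma * exp(sigma^2/2) <= exp(sigma^2) for sigma >= 3, with
#     C = 2 sqrt(2 pi); the gap only widens as sigma grows, since
#     exp(sigma^2/2) eventually dominates any power of sigma.
C = 2 * math.sqrt(2 * math.pi)
for s in (3.0, 4.0, 5.0):
    assert 1 + C * s * math.exp(s**2 / 2) <= math.exp(s**2)
print("checks passed")
```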