Assume that $\rho$ is a probability density such that $$ \rho(x) \propto \exp \left(-\frac{|x|^2}{2} - f(x) \right). $$ This means $$ \rho(x) = \frac{\exp \left(-\frac{|x|^2}{2} - f(x) \right)}{\int_{\mathbb R^n} \exp \left(-\frac{|y|^2}{2} - f(y) \right) \, \mathrm d y}. $$ Here $f$ is a smooth function such that $f(0) = \nabla f(0) = 0$ and ${\rm Hess} f(x) \geq 0$ for all $x \in \mathbb R^n$. Is there a standard result asserting that the covariance matrix of $\rho$ is bounded from above, using the usual order on positive semidefinite matrices, by the identity matrix, i.e. that the presence of the function $f$ can only reduce the covariance compared to the case where $f = 0$? I have a proof for this in dimension one (see below), but I am interested in a proof for the multidimensional case.
A natural way of proceeding would be to find a bound on $$ \int_{\mathbb R^n} xx^T \, \rho(x) \, \mathrm d x, $$ which is an upper bound for the covariance matrix. This is the approach I take below, in dimension $n = 1$.
Proof in the case $n = 1$. First note that $$ \mathrm{Cov} (\rho) = \mathbb E_{X \sim \rho}(X^2) - |\mathbb E_{X \sim \rho}(X)|^2 \leq \mathbb E_{X \sim \rho}(X^2). $$ Since $$ \mathbb E_{X \sim \rho} (X^2) = \int_{0}^{\infty} \mathbb P_{X \sim \rho} (X^2 \geq y) \, \mathrm d y, $$ it is sufficient to show that $\mathbb P_{X \sim \rho} (X^2 \geq y) \leq \mathbb P_{X \sim g} (X^2 \geq y)$ for all $y \geq 0$, where $g$ denotes the density of the standard normal distribution. In view of the fact that the function $r \mapsto \frac{r}{1-r}$ is increasing on $[0, 1)$, this is equivalent to proving the following: $$ \forall y > 0, \qquad \frac{\mathbb P_{X \sim \rho} (X^2 \geq y)}{\mathbb P_{X \sim \rho} (X^2 \leq y)} \leq \frac{\mathbb P_{X \sim g} (X^2 \geq y)}{\mathbb P_{X \sim g} (X^2 \leq y)}. $$ To show this, introduce the function $u(x) = \exp \left( - f\left(x\right) \right) + \exp \left( - f\left(-x\right) \right)$, which is positive and nonincreasing over $[0, \infty)$: indeed, since $f$ is convex with $f(0) = f'(0) = 0$, both $x \mapsto f(x)$ and $x \mapsto f(-x)$ are nondecreasing on $[0, \infty)$. Then \begin{align*} \frac{\mathbb P_{X \sim \rho} (X^2 \geq y)}{\mathbb P_{X \sim \rho} (X^2 \leq y)} &= \frac{\int_{\sqrt y}^{\infty} \exp \left( - \frac{x^2}{2} \right) u(x) \, \mathrm d x} {\int_{0}^{\sqrt y} \exp \left( - \frac{x^2}{2} \right) u(x) \, \mathrm d x} \leq \left( \frac{\max_{x \geq \sqrt y} u(x)} {\min_{0 \leq x \leq \sqrt y} u(x)} \right) \frac{\int_{\sqrt y}^{\infty} \exp \left( - \frac{x^2}{2} \right) \mathrm d x} {\int_{0}^{\sqrt y} \exp \left( - \frac{x^2}{2} \right) \mathrm d x}, \end{align*} and, since $u$ is nonincreasing, the maximum and the minimum are both equal to $u(\sqrt y)$, so the prefactor equals $1$ and the right-hand side is exactly $\frac{\mathbb P_{X \sim g} (X^2 \geq y)}{\mathbb P_{X \sim g} (X^2 \leq y)}$.
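As a quick numerical sanity check of the one-dimensional bound $\mathbb E_{X \sim \rho}(X^2) \leq 1$ (a sketch; the convex perturbation $f(x) = e^x - 1 - x$, which satisfies $f(0) = f'(0) = 0$ and $f'' = e^x > 0$, is an arbitrary choice of mine):

```python
import numpy as np

# Uniform grid; the integrand is negligible outside [-10, 10],
# so a plain Riemann sum is accurate here (the grid spacing cancels).
x = np.linspace(-10.0, 10.0, 20001)

def second_moment(f_vals):
    """E[X^2] under the density proportional to exp(-x^2/2 - f(x))."""
    w = np.exp(-x**2 / 2 - f_vals)
    return float((x**2 * w).sum() / w.sum())

# f = 0 recovers the standard Gaussian, for which E[X^2] = 1.
m2_gauss = second_moment(np.zeros_like(x))

# Convex f with f(0) = f'(0) = 0 and Hess f >= 0.
m2_rho = second_moment(np.exp(x) - 1 - x)

print(m2_gauss)  # ~ 1.0
print(m2_rho)    # the bound predicts a value < 1
```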
Ideas for the multi-dimensional case. In the multi-dimensional case, the approach above can be applied, by passing to radial coordinates, to obtain a bound of the form $$ \mathrm{Trace}(\mathrm{Cov} (\rho)) \leq \int_{\mathbb R^n} |x|^2 \, \rho(x) \, \mathrm d x \leq n, $$ where $n$ is the dimension. However, since the covariance matrix is only known to be positive semidefinite, this trace bound gives just $$ \mathrm{Cov} (\rho) \leq n I, $$ where $I$ is the identity matrix, i.e. there is an undesired factor $n$ on the right-hand side.
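For what it is worth, the conjectured bound $\mathrm{Cov}(\rho) \leq I$ can be checked numerically in dimension $n = 2$. Here is a sketch using the arbitrarily chosen convex perturbation $f(x_1, x_2) = x_1^4 + x_2^4 + (x_1 + x_2)^2$, which satisfies $f(0) = \nabla f(0) = 0$ and $\mathrm{Hess}\, f \geq 0$:

```python
import numpy as np

# Tensor grid; the integrand is negligible outside [-6, 6]^2.
t = np.linspace(-6.0, 6.0, 601)
x1, x2 = np.meshgrid(t, t, indexing="ij")

# Convex f with f(0) = grad f(0) = 0 and Hess f >= 0 (arbitrary choice).
f = x1**4 + x2**4 + (x1 + x2)**2
w = np.exp(-(x1**2 + x2**2) / 2 - f)   # unnormalized density

# Moments by Riemann sums (the grid spacing cancels in the ratios).
Z = w.sum()
m1 = np.array([(x1 * w).sum(), (x2 * w).sum()]) / Z          # mean
m2 = np.array([[(x1 * x1 * w).sum(), (x1 * x2 * w).sum()],
               [(x1 * x2 * w).sum(), (x2 * x2 * w).sum()]]) / Z
cov = m2 - np.outer(m1, m1)

# The conjecture predicts the largest eigenvalue does not exceed 1.
eig_max = np.linalg.eigvalsh(cov).max()
print(cov)
print(eig_max)  # < 1
```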
Note. In Bayesian terms, the function $e^{-x^2/2}/\sqrt{2\pi}$ can be viewed as a prior distribution, the function $e^{-f}$ as the likelihood and $\rho$ as the posterior distribution. The question can then be formulated, roughly, as follows: is the posterior covariance bounded from above by the prior covariance when the likelihood is log-concave (i.e. the negative log-likelihood is convex)?
I believe this follows from the Poincaré inequality. See for example Theorem 3.1 in https://arxiv.org/pdf/1202.1510.pdf
For a uniformly convex function $V$ satisfying $\mathrm{Hess}\, V \geq \kappa I$ with $\kappa > 0$ (writing $\kappa$ for the convexity constant, to avoid a clash with the density $\rho$ above), let $\mu$ be the distribution $\frac{1}{Z}e^{-V}$. For every smooth function $h$, we have \begin{equation} \int \bigg(h - \int h \, d\mu\bigg)^2 d\mu \leq \frac{1}{\kappa} \int |\nabla h|^2 \, d\mu . \end{equation}
Here $V(x) = \frac{|x|^2}{2} + f(x)$ satisfies $\mathrm{Hess}\, V \geq I$, so $\kappa = 1$; taking $h$ to be a linear function then gives the desired result.
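Spelling out the final step: since $$ \mathrm{Hess}\left( \frac{|x|^2}{2} + f(x) \right) = I + \mathrm{Hess}\, f(x) \geq I, $$ the Poincaré inequality for $\rho$ holds with constant $1$, and applying it to the linear test function $x \mapsto v^\top x$, for an arbitrary $v \in \mathbb R^n$, yields $$ v^\top \mathrm{Cov}(\rho) \, v = \mathrm{Var}_{X \sim \rho} (v^\top X) \leq \int_{\mathbb R^n} |\nabla (v^\top x)|^2 \, \rho(x) \, \mathrm d x = |v|^2 = v^\top I v, $$ which is precisely the claimed bound $\mathrm{Cov}(\rho) \leq I$.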