Suppose I fix a quadratic function $q:\mathbb{R}^d \to \mathbb{R}$ given as
\begin{align} q(x_1,\ldots, x_d) = \frac{1}{2} \sum_{i=1}^d a_i x_i^2 + \sum_{i=1}^d b_i x_i. \end{align}
I allow each $a_i, b_i$ to take either sign (so the `Gaussian' in the title should be interpreted somewhat loosely).
I now define a family of probability measures $\{\mu_r\}_{r > 0}$ by the density
$$\mu_r(x) = \frac{1}{Z(\mathbf{a}, \mathbf{b}, r)} \exp(-q(x)) \cdot \mathbf{I}[\| x\|_2 \leqslant r]$$
where $Z$ is the appropriate normalising constant.
More simply, I take the $\ell^2$ ball of radius $r$ around the origin and charge it with density $\propto \exp(-q)$.
Let $X_r \sim \mu_r$. I'm fairly sure that as $r \to 0$,
\begin{align} m_r &\triangleq \mathbf{E}[X_r] = \mathcal{O}(r), \\ \Sigma_r &\triangleq \mathbf{Cov}[X_r] = \mathcal{O}(r^2), \end{align}
and moreover, that one can Taylor expand these quantities around $r = 0^+$, i.e.
\begin{align} m_r &= \sum_{k \geq 1} m^{(k)}(\mathbf{a}, \mathbf{b})\, r^k, \\ \Sigma_r &= \sum_{k \geq 2} S^{(k)}(\mathbf{a}, \mathbf{b})\, r^k. \end{align}
What I would like is to get $m^{(1)}(\mathbf{a}, \mathbf{b})$ and $S^{(2)}(\mathbf{a}, \mathbf{b})$ in relatively explicit form. Higher-order terms would also be of interest, but the leading terms are my main goal.
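For concreteness, here is a small numerical sanity check of the claimed orders in the simplest case $d = 1$, where the ball is just the interval $[-r, r]$ (a sketch only; the grid size, the test values $a = 1$, $b = 0.7$, and the helper name `truncated_moments` are my own choices):

```python
import numpy as np

def truncated_moments(a, b, r, n=200_001):
    """Mean and variance of the density prop. to exp(-(a/2)x^2 - b*x) on [-r, r]."""
    x = np.linspace(-r, r, n)
    dx = x[1] - x[0]
    w = np.exp(-(0.5 * a * x**2 + b * x))  # unnormalised density exp(-q)
    Z = np.sum(w) * dx                     # crude quadrature for the normalising constant
    mean = np.sum(x * w) * dx / Z
    var = np.sum(x**2 * w) * dx / Z - mean**2
    return mean, var

# The mean should be O(r) and the variance O(r^2):
for r in [0.4, 0.2, 0.1, 0.05]:
    m, v = truncated_moments(a=1.0, b=0.7, r=r)
    print(f"r = {r:5.2f}   mean/r = {m / r:+.5f}   var/r^2 = {v / r**2:.5f}")
```

As $r \to 0$, the ratio $\mathrm{var}/r^2$ approaches $1/3$ (the variance of the uniform distribution on $[-1,1]$), while $\mathrm{mean}/r$ tends to $0$, consistent with the claimed orders.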
Change variables, writing $x = rz$, and express your moments as ratios of integrals over the unit ball $S=\{z:\|z\|_2\le 1\}$. After the change of variables, $q(x)$ becomes $r^2 q_2(z) + rq_1(z)$, where $q_2$ and $q_1$ are the pure quadratic and pure linear parts of $q$. You will also pick up powers of $r$ coming from the Jacobian, though these cancel between numerator and denominator. Now your integrand has a Taylor expansion in $r$, and the desired asymptotics can be read off by standard techniques. Since the range of integration $S$ is compact, you can derive error estimates and concoct a dominated convergence justification for the calculation.
In summary: change variables, expand the integrand in powers of $r$, discard all but the first few terms, integrate the remaining few terms.
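Carrying this recipe out for the mean, one finds (a sketch, modulo the dominated-convergence bookkeeping; here $V_d$ is the volume of the unit ball and I use $\int_S z_i z_j \, dz = \frac{V_d}{d+2}\delta_{ij}$, together with the vanishing of odd moments over $S$):
\begin{align*}
\mathbf{E}[X_r]
  &= r \cdot \frac{\int_S z \, e^{-r q_1(z) - r^2 q_2(z)} \, dz}{\int_S e^{-r q_1(z) - r^2 q_2(z)} \, dz}
   = r \cdot \frac{\int_S z \left(1 - r q_1(z) + \mathcal{O}(r^2)\right) dz}{V_d + \mathcal{O}(r^2)} \\
  &= r \cdot \frac{-r \frac{V_d}{d+2}\, \mathbf{b} + \mathcal{O}(r^3)}{V_d + \mathcal{O}(r^2)}
   = -\frac{\mathbf{b}}{d+2}\, r^2 + \mathcal{O}(r^4).
\end{align*}
So the leading coefficient of the mean actually vanishes, $m^{(1)} = 0$, and the first nonzero coefficient is $m^{(2)} = -\mathbf{b}/(d+2)$. For the covariance, $\mathbf{Cov}[X_r] = r^2\, \mathbf{Cov}[Z_r]$ with $Z_r$ converging to the uniform distribution on $S$ as $r \to 0$, which gives $S^{(2)} = \frac{1}{d+2} I_d$.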