Expectation of $1/||X||^2$ under $X \sim N(\theta, I)$ goes to 0 as $||\theta||$ goes to infinity


Let $p \geq 3$ and $X \sim N(\theta, I_p)$. I want to show that

$$\mathbb{E}_\theta\bigg(\frac{1}{||X||^2}\bigg) \rightarrow 0 \textrm{ as } ||\theta|| \rightarrow \infty.$$
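Before the proof, here is a quick Monte Carlo sanity check of the claim (not part of the argument; the dimension $p = 5$, the sample size, and the values of $r$ are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
p, n = 5, 200_000  # dimension (arbitrary choice with p >= 3) and Monte Carlo sample size

def mean_inv_sq_norm(r):
    """Monte Carlo estimate of E_theta[1/||X||^2] for ||theta|| = r."""
    theta = np.zeros(p)
    theta[0] = r  # by spherical symmetry, only ||theta|| matters
    X = rng.normal(size=(n, p)) + theta
    return np.mean(1.0 / np.sum(X**2, axis=1))

estimates = [mean_inv_sq_norm(r) for r in (0.0, 2.0, 10.0, 50.0)]
print(estimates)  # decreasing toward 0 as ||theta|| grows
```

At $r = 0$ the estimate should be near $1/(p-2) = 1/3$, the exact value of $\mathbb{E}(1/\chi^2_p)$ for $p = 5$, and the estimates shrink toward $0$ as $r$ increases.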

I have the following proof, but it is not very straightforward, so I am wondering if there is a simple argument.

  1. First note that $\mathbb{E}_\theta\big(\frac{1}{||X||^2}\big)$ depends only on $||\theta||$ by spherical symmetry. Let $g: [0, \infty) \rightarrow [0, \infty]$ be such that $g(||\theta||) = \mathbb{E}_\theta\big(\frac{1}{||X||^2}\big).$ Note also that it is clear that $g$ is non-increasing.

  2. Define the James-Stein estimator $\delta^{JS}(X) = \big(1 - \frac{p-2}{||X||^2}\big)X$. It can be shown that

$$\mathbb{E}_\theta\Big(||\delta^{JS}(X) - \theta||^2\Big) = p - (p-2)^2\mathbb{E}_\theta\bigg(\frac{1}{||X||^2}\bigg).$$ Since the left side is $\geq 0$, we must have that $\mathbb{E}_\theta\big(\frac{1}{||X||^2}\big)$ is finite for all $\theta \in \mathbb{R}^p$, so $g$ is finite.

  3. Let $S = \{x \in \mathbb{R}^p : ||x|| \leq 1\}$ be the closed unit ball (the unit sphere itself has measure zero, so the ball is the relevant set), and $E = \mathbb{R}^p \setminus S$. Then $g(r) = g_S(r) + g_E(r)$, where $g_S(r) = \mathbb{E}_\theta\big(\frac{1}{||X||^2} 1_{X\in S}\big)$ and $g_E(r) = \mathbb{E}_\theta\big(\frac{1}{||X||^2} 1_{X\in E}\big)$ for any $\theta$ with $||\theta|| = r$.

  4. It is not hard to show that $g_E(r) \rightarrow 0$ as $r \rightarrow \infty$. Now, given arbitrary $\epsilon > 0$, we can pick $r \geq 1$ large enough that $f(x \mid r\mathbf{e}_1) \leq \epsilon f(x \mid \mathbf{e}_1)$ for all $x \in S$, where $f(x \mid \theta)$ is the p.d.f. of the $N(\theta, I_p)$ distribution and $\mathbf{e}_1 = (1, 0, \ldots, 0)$. Then $g_S(r) \leq \epsilon g_S(1)$. Since $g_S(1)$ is finite, we conclude $\inf_{r \geq 0} g_S(r) = 0$.

Thus, $\inf_{r\geq 0} g(r) = 0$. Since $g$ is non-increasing, its limit as $r \to \infty$ must be 0.
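As an aside, the risk identity from step 2 can be checked numerically. The sketch below estimates both sides of the identity from the same Gaussian draws; the choices $p = 5$ and $\theta = 3\mathbf{e}_1$ are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)
p, n = 5, 500_000                      # arbitrary dimension (p >= 3) and sample size
theta = np.zeros(p)
theta[0] = 3.0                         # any theta works; the identity holds for all theta

X = rng.normal(size=(n, p)) + theta
sqnorm = np.sum(X**2, axis=1)          # ||X||^2 for each draw

# Left side: risk of the James-Stein estimator, E||delta_JS(X) - theta||^2
delta = (1.0 - (p - 2) / sqnorm)[:, None] * X
lhs = np.mean(np.sum((delta - theta)**2, axis=1))

# Right side: p - (p-2)^2 * E[1/||X||^2]
rhs = p - (p - 2)**2 * np.mean(1.0 / sqnorm)

print(lhs, rhs)  # the two estimates agree up to Monte Carlo error
```

The left-side estimate also lands strictly below $p$, illustrating the domination of $X$ by the James-Stein estimator mentioned in the note below.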

Note: this result shows in particular that, although the James-Stein estimator dominates the estimator $X$ under quadratic loss, they have the same maximal risk.

I don't know if this is fundamentally simpler, but it looks superficially different.

The quantity $R=\|X\|^2$ has a non-central chi-squared distribution with $p$ degrees of freedom and non-centrality parameter $\lambda=\|\theta\|^2$. We can represent $R=S+T$, where $S\ge0$ and $T\ge0$ are independent, $S$ has an ordinary chi-squared distribution with $p$ degrees of freedom, and $T$ is a Poisson mixture of ordinary chi-squared random variables: conditional on $K=k$, where $K$ is Poisson distributed with parameter $\lambda/2$, $T$ has a chi-squared distribution with $2k$ degrees of freedom (and $T=0$ when $K=0$). ($T$ is sometimes called a non-central chi-squared on zero degrees of freedom; see the Wikipedia page on the noncentral chi-squared distribution for details.)
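This mixture representation is easy to check by simulation: draw $R$ directly as $\|X\|^2$ and compare with $S + T$ built from the Poisson mixture. A sketch (the values $p = 5$ and $\lambda = 9$ are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
p, lam, n = 5, 9.0, 500_000            # arbitrary: dimension p, lam = ||theta||^2

# Direct simulation of R = ||X||^2 with X ~ N(theta, I_p), ||theta||^2 = lam
theta = np.zeros(p)
theta[0] = np.sqrt(lam)
R_direct = np.sum((rng.normal(size=(n, p)) + theta)**2, axis=1)

# Mixture representation: R = S + T, S ~ chi^2_p, T ~ chi^2_{2K} with K ~ Poisson(lam/2)
S = rng.chisquare(p, size=n)
K = rng.poisson(lam / 2.0, size=n)
T = np.zeros(n)
pos = K > 0
T[pos] = rng.chisquare(2 * K[pos])     # T = 0 on the event {K = 0}
R_mix = S + T

print(R_direct.mean(), R_mix.mean())   # both should be near p + lam
```

Both sample means should be close to $\mathbb{E}(R) = p + \lambda$.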

We will use the dominated convergence theorem with dominating integrand $1/S$; since $T\ge0$, we have $1/R\le 1/S$. Note that $E(1/S)<\infty$ because of the shape of the density function of $S$: up to a normalizing constant, you are integrating $\int_0^\infty s^{p/2-1} \exp(-s/2) (1/s)\,ds$, which is finite precisely when $p\ge3$.
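In fact the dominating expectation has a simple closed form: evaluating that gamma integral gives $E(1/S) = \Gamma(p/2-1)\,/\,\big(2\,\Gamma(p/2)\big) = 1/(p-2)$ for $p \ge 3$. A short check of this value, exactly and by Monte Carlo (the choice $p = 5$ is arbitrary):

```python
import numpy as np
from math import gamma

p = 5  # any p >= 3 makes the integral finite

# Closed form: for S ~ chi^2_p, E[1/S] = Gamma(p/2 - 1) / (2 * Gamma(p/2)) = 1/(p - 2)
exact = gamma(p / 2 - 1) / (2 * gamma(p / 2))
print(exact, 1 / (p - 2))              # both equal 1/3 for p = 5

# Monte Carlo cross-check of the same expectation
rng = np.random.default_rng(3)
mc = np.mean(1.0 / rng.chisquare(p, size=500_000))
print(mc)
```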

Let $\lambda_n\to\infty$ and let $T_n$ be a random variable with non-centrality parameter $\lambda_n$ and zero degrees of freedom, independent of $S$. We want to show that $E(1/(S+T_n))\to0$. Note that $1/T_n$ converges in distribution to $0$ as $n\to\infty$, since $T_n\to\infty$ in probability. By the Skorokhod representation theorem, we may assume $1/T_n$ converges to $0$ with probability $1$. So now we use the DCT. We have $1/(S+T_n)\le 1/S$ with probability $1$, we have $\lim_{n\to\infty}1/(S+T_n)=0$ with probability $1$, and we have $E(1/S)<\infty$. So the DCT tells us $E(1/(S+T_n))\to0$, as desired.

Another way to prove the result is to notice that this zero-degrees-of-freedom decomposition implies the family of distributions of $1/\|X\|^2$ is uniformly integrable (each member is dominated by the single integrable random variable $1/S$). Then passage to the limit under the integral sign is justified.