A question about the central limit theorem


The question is:
Let $g:\mathbb{R}\rightarrow \mathbb{R}$ have at least three bounded continuous derivatives, and let $X_1, X_2, \dots$ be i.i.d. random variables in $L^2$. Prove that:

i) $\sqrt{n}\,[g(\overline{X_n}) - g(\mu)]\xrightarrow{w} N(0,g'(\mu)^{2} \sigma ^2)$ and that

ii) $E[g(\overline{X_n})-g(\mu)] = \frac{\sigma^2g''(\mu)}{2n} + o(n^{-1})$ as $n\rightarrow \infty$

where $\overline{X_n} = \frac{1}{n}\sum_{i=1}^{n} X_i$, $\mu = EX_1$, $\sigma^2=\operatorname{Var}(X_1)$.

I have proved i) using the CLT. For ii), part i) suggests $g(\overline{X_n}) - g(\mu)\approx N(0,g'(\mu)^{2} \sigma ^2/n)$ for large $n$. Since the right-hand side of ii) involves $g''(\mu)$, I was thinking of expanding the density $e^{-x^2/(2g'(\mu)^2\sigma^2/n)}$ in a Taylor series at $\mu$, but it only contains $g'(\mu)$, which is a constant. If it contained $g'(x)$ instead, differentiating would produce a second derivative, so I am not sure how to approach the problem. Thanks, and I would appreciate a hint.
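As a numerical sanity check (not a proof), the claimed $\frac{\sigma^2 g''(\mu)}{2n}$ bias in ii) can be estimated by simulation. Here is a minimal sketch assuming a concrete choice not taken from the problem: $g=\sin$ and $X_i\sim\text{Exponential}(1)$, so that $\mu=1$, $\sigma^2=1$, and the predicted limit of $n\,E[g(\overline{X_n})-g(\mu)]$ is $g''(1)/2=-\sin(1)/2$.

```python
import numpy as np

def scaled_bias(g, sample, g_mu, n, reps=200_000, chunk=10_000, seed=0):
    """Monte Carlo estimate of n * E[g(Xbar_n) - g(mu)].

    sample(rng, shape) must return i.i.d. draws of X_1 with that shape;
    g_mu is the exact value g(mu). Work in chunks to bound memory.
    """
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(reps // chunk):
        xbar = sample(rng, (chunk, n)).mean(axis=1)  # chunk copies of Xbar_n
        total += np.sum(g(xbar) - g_mu)
    return n * total / reps

# Assumed example: g = sin, X_i ~ Exponential(1), mu = sigma^2 = 1,
# so (ii) predicts the estimate below approaches -sin(1)/2 ~ -0.42.
est = scaled_bias(np.sin, lambda rng, shape: rng.exponential(1.0, shape),
                  g_mu=np.sin(1.0), n=1600)
print(est, -np.sin(1.0) / 2)
```

With $n=1600$ and $2\times 10^5$ replicates, the estimate lands near $-\sin(1)/2$ up to Monte Carlo noise, consistent with ii).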


Best answer:

Here is a variation on the other answer. It has two parts.

First, write the Taylor approximation as $$g(x)=g(\mu)+g'(\mu)\,(x-\mu) + g''(\mu)\,(x-\mu)^2/2 + (x-\mu)^2h(x-\mu)$$ where $h$ is continuous and bounded, and vanishes at $0$. (This should be clear from the usual Taylor expansion error bound for $x$ near $\mu$, and from the boundedness of the derivatives of $g$ for $x$ far from $\mu$: one has $h(x-\mu)=\bigl(g(x)-g(\mu)-g'(\mu)\,(x-\mu) - g''(\mu)\,(x-\mu)^2/2\bigr)/(x-\mu)^2$, which is clearly bounded when $x$ is far from $\mu$.)
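One way to make the boundedness of $h$ explicit (a sketch, using the Lagrange form of the remainder): for each $x$ there is a $\xi_x$ between $\mu$ and $x$ with $$g(x) = g(\mu) + g'(\mu)(x-\mu) + \tfrac12 g''(\xi_x)(x-\mu)^2, \qquad\text{so}\qquad h(x-\mu) = \tfrac12\bigl(g''(\xi_x) - g''(\mu)\bigr).$$ Hence $|h|\le\sup|g''|$, and $h(x-\mu)\to 0$ as $x\to\mu$ by continuity of $g''$.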

Second, write $Z_n = \sqrt n(\overline X_n-\mu)$, so $Z_n\to Z$ in distribution, where $Z\sim N(0,\sigma^2)$, by the usual CLT.

Now the error term in the original problem's (ii) is $$E\left( Z_n^2 h(Z_n/\sqrt n)\right) / n$$ and the problem boils down to showing that $\lim_{n\to\infty} E\left( Z_n^2 h(Z_n/\sqrt n) \right)= 0$.
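Spelling out where this comes from: take expectations in the expansion above and use $E(\overline X_n-\mu)=0$ and $E(\overline X_n-\mu)^2=\sigma^2/n$: $$E\bigl[g(\overline X_n)-g(\mu)\bigr] = \frac{g''(\mu)}{2}\,E(\overline X_n-\mu)^2 + E\bigl[(\overline X_n-\mu)^2 h(\overline X_n-\mu)\bigr] = \frac{\sigma^2 g''(\mu)}{2n} + \frac{1}{n}\,E\bigl[Z_n^2\, h(Z_n/\sqrt n)\bigr].$$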

Since the $Z_n$ converge in distribution to $Z$, there exist random variables $Y_n$ with the same distribution as $Z_n$ such that $Y_n\to Z$ with probability 1. (For instance, $Y_n = F_n^{-1}(\Phi(Z))$ where $F_n$ is the cdf of $Z_n$ and $\Phi$ is the cdf of $Z$, or by Skorokhod's representation theorem.) Since $Y_n\to Z$ w.p. 1 and $EY_n^2 = EZ_n^2 = \sigma^2 = EZ^2$, the sequence $Y_n^2$ is uniformly integrable. This, plus $h$ being bounded, implies the sequence $Y_n^2 h(Y_n/\sqrt n)$ is uniformly integrable.

(For details of the uniform integrability: Hewitt and Stromberg, Theorem 13.47, for instance, implies the convergence $Y_n^2\to Z^2$ holds in $L^1$, and Exercise 13.39 then gives uniform integrability of the $Y_n^2$. See also Theorem 21 in Dellacherie and Meyer, Probability and Potential, 1978 edition, p."23-11".)

So $$\lim_{n\to\infty}E(Y_n^2 h(Y_n/\sqrt n)) = E \Bigl(\lim_{n\to\infty} Y_n^2 h(Y_n/\sqrt n)\Bigr) = E\bigl( Z^2 h(0)\bigr) = E(Z^2\cdot 0) = 0.$$ But $Z_n$ and $Y_n$ have the same distribution, so $$\lim_{n\to\infty}E(Z_n^2 h(Z_n/\sqrt n)) = \lim_{n\to\infty}E(Y_n^2 h(Y_n/\sqrt n)) = 0,$$ as desired.

Another answer:

This is more a comment than an answer, but at least an idea.

Using Taylor's formula with integral remainder gives $$ g\left(\overline{X_n}\right)-g(\mu)=\left(\overline{X_n}-\mu\right)g'(\mu) +\frac{\left(\overline{X_n}-\mu\right)^2}2g''(\mu)+\frac 12\int_\mu^{\overline{X_n}}g^{(3)}(t)\left(\overline{X_n}-t\right)^2\mathrm dt. $$ Therefore, taking expectations reduces the problem to showing that $$ \lim_{n\to +\infty}n\,\mathbb E\left[\int_\mu^{\overline{X_n}}g^{(3)}(t)\left(\overline{X_n}-t\right)^2\mathrm dt\right]=0. $$
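One hedged way to finish from here, under an assumption beyond the stated hypotheses (it uses $X_1\in L^3$ rather than just $L^2$): with $M=\sup|g^{(3)}|$, $$\left|\int_\mu^{\overline{X_n}} g^{(3)}(t)\left(\overline{X_n}-t\right)^2\,\mathrm dt\right| \le \frac{M}{3}\left|\overline{X_n}-\mu\right|^3,$$ and the Marcinkiewicz–Zygmund inequality gives $\mathbb E\left|\overline{X_n}-\mu\right|^3 = O(n^{-3/2})$ when $\mathbb E|X_1|^3<\infty$, so $n$ times the expected remainder tends to $0$. Under $L^2$ alone this bound is not available, which is consistent with the author's remark that this is more a comment than an answer.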