Convergence in probability and boundedness in probability with respect to the sample mean and sample variance


This is a question about convergence in probability and boundedness in probability.

Suppose $X_i \overset{\textrm{i.i.d.}}{\sim} (\mu, \sigma^2 )$ for $i=1,2, \cdots, n$.

Let $\overline{X}$ and $\hat{\sigma^2}$ denote the sample mean and the sample variance, respectively.

Then prove that the following holds for every $x \in \mathbb{R}$:

$$\frac{x-\overline{X}}{\hat{\sigma^2}}-\frac{x-\mu}{\sigma^2} = O_p (n^{-1/2}) $$

The central limit theorem, the strong law of large numbers, and Slutsky's theorem are available.

Thank you.

P.S. In fact, I have no idea how to control the sample variance, since it appears in the denominator. It is hard to apply the central limit theorem to the sample variance, since the fourth moment might not exist.


Best answer:

Fix $x\in \mathbb R$. By replacing $X_i$ with $Y_i = X_i-x$ (which shifts the mean to $\mu - x$ and leaves the variance unchanged), you can assume WLOG that $x=0$.
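Spelling out this reduction (a small added step, not in the original answer): with $Y_i = X_i - x$ one has $\overline Y = \overline X - x$, the sample variance is unchanged ($S_Y^2 = S^2$), and $E[Y_i] = \mu - x$ with $\operatorname{Var}(Y_i) = \sigma^2$, so
$$\frac{x-\overline X}{S^2}-\frac{x-\mu}{\sigma^2} = \frac{0-\overline Y}{S_Y^2}-\frac{0-(\mu-x)}{\sigma^2},$$
which is the same expression evaluated at $x=0$ for the shifted sample.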

Writing $S^2$ for the sample variance $\hat{\sigma^2}$, you need to show that $$\sqrt n \frac{(\overline X \sigma^2 - \mu S^2)}{S^2 \sigma^2} = O_P(1),$$ which is, up to sign, the difference in question multiplied by $\sqrt n$.

Since $S^2$ converges almost surely to $\sigma^2>0$ (by the strong law of large numbers), the continuous mapping theorem gives $\frac 1{S^2} \to \frac 1{\sigma^2}$ almost surely, hence $\frac 1{S^2} = O_P(1)$.

It only remains to deal with the numerator, which rewrites as $$\sqrt n (\overline X -\mu)\sigma^2 - \sqrt n (S^2-\sigma^2)\mu.$$ The first summand is $O_P(1)$ by the central limit theorem.

For the second summand, assume for simplicity that $S^2 = \frac 1n \sum_i (X_i-\overline X)^2$ (i.e., without the classical $\frac 1 {n-1}$ correction that makes the sample variance unbiased; the correction changes $\sqrt n(S^2 - \sigma^2)$ only by an $o_P(1)$ term) and note that $$\sqrt n (S^2-\sigma^2) = \sqrt n \bigl(\overline{X^2} - E[X_1^2]\bigr) - \sqrt n (\overline X -\mu)(\overline X +\mu).$$ The term $\sqrt n \bigl(\overline{X^2}- E[X_1^2]\bigr)$ is $O_P(1)$ by the central limit theorem, provided $E[X_1^4]<\infty$ so that $X_1^2$ has finite variance; this extra moment assumption, which the question's P.S. worries about, is genuinely needed for the $\sqrt n$ rate on $S^2$. By the law of large numbers, $\overline X +\mu \to 2\mu$ almost surely, so $\overline X +\mu = O_P(1)$, hence $\sqrt n (\overline X -\mu)(\overline X +\mu) = O_P(1)\,O_P(1) = O_P(1)$. Thus the whole numerator is $O_P(1)$, which completes the proof.
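As a numerical sanity check (not part of the original answer), here is a small Monte Carlo sketch with normal data; the choices of $\mu$, $\sigma^2$, $x$, the sample sizes, and the replication count are all illustrative assumptions. If the claim holds, the scaled error $\sqrt n\,\bigl((x-\overline X)/S^2-(x-\mu)/\sigma^2\bigr)$ should stay stochastically bounded as $n$ grows:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma2, x = 0.0, 1.0, 2.0  # illustrative parameter choices
B = 2000                       # Monte Carlo replications per sample size

def scaled_error(n):
    """Draw B samples of size n and return sqrt(n) times the error
    (x - Xbar)/S^2 - (x - mu)/sigma^2 for each sample."""
    X = rng.normal(mu, np.sqrt(sigma2), size=(B, n))
    Xbar = X.mean(axis=1)
    S2 = X.var(axis=1)  # ddof=0: the 1/n version used in the answer
    return np.sqrt(n) * ((x - Xbar) / S2 - (x - mu) / sigma2)

for n in (100, 400, 1600, 6400):
    q95 = np.quantile(np.abs(scaled_error(n)), 0.95)
    print(f"n={n:5d}  95% quantile of |sqrt(n) * error| = {q95:.2f}")
```

With these settings the printed quantiles should stabilize near a constant rather than grow with $n$, consistent with the $O_p(n^{-1/2})$ claim; note that the normal distribution has all moments, so the fourth-moment condition above is satisfied.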