Help show a statistic converges in probability given another statistic that converges in probability


Let $Y = (Y_1,\dots,Y_n)$ be a random sample from $N(\mu,1)$ and $\bar{Y}=\sum\limits_{i=1}^nY_i/n$

I am given that $\bar{Y}^2$ converges in probability to $\mu^2$ and now need to show that $\bar{Y}^2-\frac{1}{n}$ also converges in probability to $\mu^2$.

That $\bar{Y}^2$ converges in probability to $\mu^2$ means: $$ \forall \delta>0 ,\forall \epsilon>0, \exists N, \text{ such that }\\\,n>N \Rightarrow P\left(\left|\bar{Y}^2-\mu^2\right| \ge \delta\right)<\epsilon$$

and

$$\lim\limits_{n \rightarrow \infty} \frac{1}{n} = 0$$

It looks to me that the solution involves showing that, for $n$ large enough, $P\left(\left|\bar{Y}^2-\frac{1}{n}-\mu^2\right| \ge \delta \right) \le P\left(\left|\bar{Y}^2-\mu^2\right| \ge \delta \right)<\epsilon$. I tried to establish this by requiring $0<\frac{1}{n}<\frac{\delta}{2}$, which was a common step in the exercises involving $\epsilon$-$\delta$ proofs, but I am unable to prove the first inequality. In fact, I suspect it is false: the interval in the LHS is smaller than the interval on the RHS, so the probability should be $>$ instead of $\le$.

Am I on the right track and is there an alternative method that is quicker?

Best answer:

The sequence of random variables $\{\bar{Y}^2_n-\frac{1}{n}\}$ converges to the constant $\mu^2$ in probability if, for each $\epsilon>0, \delta>0$, there exists a natural number $N$ such that, for all $n>N$, $$ P\left(\left\vert \bar{Y}^2_n-\frac{1}{n} - \mu^2\right\vert\geqslant\epsilon \right)<\delta\,. $$

So, choose $\epsilon > 0,\ \delta>0$ and observe that, for the following events, we have $$ \left\{\left\vert \bar{Y}^2_n-\frac{1}{n}-\mu^2\right\vert \geqslant \epsilon\right\} \subseteq \left\{\left\vert \bar{Y}^2_n-\mu^2\right\vert + \left\vert\frac{1}{n}\right\vert \geqslant \epsilon \right\}\,.\tag{1} $$

Now, there exists a natural number $n_{\epsilon}$ such that, for all $n\geqslant n_{\epsilon}$ we have $\epsilon - \left\vert\frac{1}{n}\right\vert>0$. Therefore, $(1)$ implies $$ P\left(\left\vert \bar{Y}^2_n-\frac{1}{n}-\mu^2\right\vert\geqslant\epsilon \right) \leqslant P\left(\left\vert \bar{Y}^2_n-\mu^2\right\vert + \left\vert\frac{1}{n}\right\vert\geqslant\epsilon \right) = P\left(\left\vert \bar{Y}^2_n-\mu^2\right\vert \geqslant \epsilon - \left\vert\frac{1}{n}\right\vert \right) \leqslant P\left(\left\vert \bar{Y}^2_n-\mu^2\right\vert \geqslant \epsilon - \left\vert\frac{1}{n_{\epsilon}}\right\vert \right) \,. $$

But, $\left\{\bar{Y}^2_n\right\}$ converges to $\mu^2$ in probability. Therefore, there exists $N$ such that, for all $n>N$, the last probability above is smaller than $\delta$.

Therefore, we have shown that for each $\epsilon>0, \delta>0$ there exists a natural number $N$ such that, for all $n>\max(N, n_{\epsilon})$, $$ P\left(\left\vert \bar{Y}^2_n-\frac{1}{n} - \mu^2\right\vert\geqslant\epsilon \right)<\delta\,. $$
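The key step, the event inclusion in $(1)$, can be sanity-checked numerically. Below is a minimal Monte Carlo sketch (assuming NumPy; the choices $\mu=2$, $\epsilon=0.1$, $n=500$ are arbitrary): since the inclusion holds sample by sample, the empirical LHS frequency can never exceed the empirical RHS frequency.

```python
import numpy as np

# Monte Carlo check of the event inclusion (1):
# {|Ybar_n^2 - 1/n - mu^2| >= eps} is contained in {|Ybar_n^2 - mu^2| >= eps - 1/n},
# so the empirical frequency of the first event is at most that of the second.
rng = np.random.default_rng(1)
mu, eps, n, reps = 2.0, 0.1, 500, 10_000

# reps independent samples of size n from N(mu, 1); squared sample means.
ybar2 = rng.normal(mu, 1.0, size=(reps, n)).mean(axis=1) ** 2

lhs = np.mean(np.abs(ybar2 - 1.0 / n - mu**2) >= eps)
rhs = np.mean(np.abs(ybar2 - mu**2) >= eps - 1.0 / n)
print(lhs, rhs)  # lhs <= rhs
```

This only illustrates the inclusion for one fixed $n$; the proof above handles all $n>\max(N,n_\epsilon)$.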



As an alternative proof, note that the sequence $\left\{-1/n\right\}$ converges in probability to $0$. Therefore, by Slutsky's theorem, $\left\{\bar{Y}^2_n - 1/n\right\}$ converges in probability to $\mu^2$.
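The conclusion itself can also be checked by simulation. A minimal sketch (assuming NumPy; $\mu=2$ and $\delta=0.1$ are arbitrary choices): estimate $P\left(\left|\bar{Y}^2_n-\frac{1}{n}-\mu^2\right|\ge\delta\right)$ for growing $n$ and watch it shrink toward $0$.

```python
import numpy as np

# Empirical check that Ybar_n^2 - 1/n converges to mu^2 in probability:
# the exceedance probability should decrease toward 0 as n grows.
rng = np.random.default_rng(0)
mu, delta, reps = 2.0, 0.1, 1_000

probs = []
for n in [10, 100, 1_000, 10_000]:
    ybar = rng.normal(mu, 1.0, size=(reps, n)).mean(axis=1)  # sample means
    stat = ybar**2 - 1.0 / n
    probs.append(np.mean(np.abs(stat - mu**2) >= delta))

print(probs)  # decreasing sequence of estimated probabilities
```

Of course, a simulation is evidence and not a proof; it simply makes the Slutsky argument concrete.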