Asymptotic Convergence of the Mean Estimator


I want to show:

Let $x_{1},\dots,x_{n}$ be iid random variables with support $x_{i} \in [0,1]$.

Prove $n^{1/3}\frac{1}{n} \sum\limits_{i=1}^{n} (x_{i} - \mathbb{E}[x_{i}] ) \xrightarrow{p} 0$.

From what I can tell, there are two ways to solve this problem:

1st way:

We say that a sequence of random variables converges in probability, i.e., $x_{t}\xrightarrow{p}x$, if $\lim_{t\to\infty}P(|x_{t}-x|>\epsilon)=0$ for every $\epsilon>0.$

So to show

$\theta_{n}=n^{1/3}\frac{1}{n} \sum\limits_{i=1}^{n} (x_{i} - \mathbb{E}[x_{i}] ) \xrightarrow{p} 0$

we use the definition,

$P(|\theta_{n}-0|>\epsilon)=P(|n^{1/3}\frac{1}{n} \sum\limits_{i=1}^{n} (x_{i} - \mathbb{E}[x_{i}] )|>\epsilon)=P(|n^{1/3}(\bar{X}-\mu) |>\epsilon)$

Using Chebyshev's Inequality

$P(|n^{1/3}(\bar{X}-\mu) |>\epsilon)\leq \frac{E[n^{1/3}(\bar{X}-\mu)^2]}{\epsilon^2}=\frac{n^{1/3}E[(\bar{X}-\mu)^2]}{\epsilon^2}=\frac{n^{1/3}\sigma^2}{n\epsilon^2}=\frac{\sigma^2}{n^{2/3}\epsilon^2}\xrightarrow{n\rightarrow\infty}0.$ QED
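As a numerical sanity check on this convergence, a small Monte Carlo simulation (assuming, for concreteness, $x_i \sim \text{Uniform}[0,1]$, so $\mu = 0.5$ — the problem only requires support in $[0,1]$) estimates the tail probability $P(|n^{1/3}(\bar{X}-\mu)| > \epsilon)$ and watches it shrink as $n$ grows:

```python
import random

def tail_prob(n, eps=0.1, trials=2000, seed=0):
    """Monte Carlo estimate of P(|n^(1/3) * (Xbar - mu)| > eps),
    with X_i ~ Uniform[0,1] so that mu = 0.5 (illustrative assumption)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        xbar = sum(rng.random() for _ in range(n)) / n
        if abs(n ** (1 / 3) * (xbar - 0.5)) > eps:
            hits += 1
    return hits / trials

# The estimated tail probability should decrease toward 0 as n grows.
for n in (10, 100, 1000, 10000):
    print(n, tail_prob(n))
```

The function name and the Uniform choice are illustrative; any distribution on $[0,1]$ gives the same qualitative picture since the variance is bounded.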

2nd way: From the CLT we know

$\bar{X}=\mu+O(\frac{1}{\sqrt{n}}) \;\Rightarrow\; \bar{X}-\mu=O(\frac{1}{\sqrt{n}})$

$ \sqrt{n}(\bar{X}-\mu)=\sqrt{n}O(\frac{1}{\sqrt{n}})=O(1)$

which as $n\rightarrow \infty$ $O(1)\rightarrow 0$.

If instead we multiply by $n^\frac{1}{3}$

$ n^\frac{1}{3}(\bar{X}-\mu)=n^\frac{1}{3}O(\frac{1}{\sqrt{n}})=O(\frac{n^\frac{1}{3}}{\sqrt{n}})=O(\frac{1}{n^{1/6}})\xrightarrow{n\rightarrow\infty}0$. QED

My question is this:

(1) Are the two proofs I wrote valid?

(2) For the 2nd proof, I am still confused about how we derived $\bar{X}-\mu=O(\frac{1}{\sqrt{n}})$. I believe it was derived using a Taylor series expansion of the characteristic function, but I am not entirely certain how that yields the $O(\frac{1}{\sqrt{n}})$. Despite not knowing how to show it, I thought my logic was still correct from there on.


2 Answers

Answer 1:

$ \bar{X}-\mu=O(\frac{1}{\sqrt{n}})$ comes from

$\bar{X}=\mu+O(\frac{1}{\sqrt{n}})$


Define $Z_n := \sqrt{n}(\frac{X_1 + ... + X_n}{n} - \mu)$

$\to \frac{Z_n}{\sqrt{n}} = (\frac{X_1 + ... + X_n}{n} - \mu)$

$\to \frac{Z_n}{\sqrt{n}} + \mu = \frac{X_1 + ... + X_n}{n}$

$\to \frac{Z_n}{\sqrt{n}} + \mu = \bar X_n$

Actually,

$$\text{your} \ 'O(\frac{1}{\sqrt{n}})' = \frac{Z_n}{\sqrt{n}}$$

Note that as you said:

$$\sqrt{n} \ 'O(\frac{1}{\sqrt{n}})' = \sqrt{n} \frac{Z_n}{\sqrt{n}}$$

$$\ 'O(1)' = Z_n$$

However, note that $Z_n$ does not tend to $0$: by the CLT,

$$Z_n = \sqrt{n}\Big(\frac{X_1 + \cdots + X_n}{n} - \mu\Big) \xrightarrow{d} N(0,\sigma^2),$$

so $Z_n$ is bounded in probability, i.e. $O_P(1)$, rather than vanishing.
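A quick simulation can be used to inspect the behaviour of $Z_n$ directly (assumption for concreteness: $X_i \sim \text{Uniform}[0,1]$, so $\sigma = \sqrt{1/12} \approx 0.289$):

```python
import random

def zn_std(n, trials=1000, seed=2):
    """Sample standard deviation of Z_n = sqrt(n) * (Xbar - mu)
    for X_i ~ Uniform[0,1] (mu = 0.5, sigma = sqrt(1/12) ~ 0.289)."""
    rng = random.Random(seed)
    vals = []
    for _ in range(trials):
        xbar = sum(rng.random() for _ in range(n)) / n
        vals.append(n ** 0.5 * (xbar - 0.5))
    mean = sum(vals) / trials
    var = sum((v - mean) ** 2 for v in vals) / (trials - 1)
    return var ** 0.5

# The spread of Z_n stabilizes near sigma instead of shrinking with n.
for n in (100, 1000, 10000):
    print(n, zn_std(n))
```

The roughly constant spread is exactly what "$Z_n = O_P(1)$" means: $Z_n$ is bounded in probability, but it does not collapse to $0$.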

Answer 2:

For your first proof, there is a calculation error when applying Chebyshev's inequality: $n^{1/3}$ also needs to be squared, giving $n^{2/3}$. Fortunately, this error doesn't affect the final result.

For your second proof, the logic is correct but there is an important conceptual mistake. Since $\bar{X}$ is random, you should write $$\sqrt{n}(\bar{X} - \mu) = O_\color{red}{P}(1)$$ instead of $O(1)$, which is used in a deterministic/nonrandom setting (such as in calculus or real analysis). By the way, $O(1)$ doesn't necessarily tend to $0$ as $n \to \infty$.

A correct write-up of your second proof is as follows: $$n^{1/3}(\bar{X} - \mu) = n^{-1/6}\times \sqrt{n}(\bar{X} - \mu) \Rightarrow 0$$ by the central limit theorem and Slutsky's lemma. Since the limit $0$ is a constant, convergence in distribution to it is equivalent to convergence in probability.
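The distinction between $O_P(1)$ and convergence to $0$ can also be seen numerically. A sketch (assuming Uniform$[0,1]$ samples, chosen only for concreteness): the $\sqrt{n}$-scaled error keeps a roughly constant spread, while the $n^{1/3}$-scaled error decays like $n^{-1/6}$:

```python
import random

def rms(scale_pow, n, trials=1000, seed=3):
    """Root-mean-square of n^scale_pow * (Xbar - mu) over Monte Carlo
    trials, with X_i ~ Uniform[0,1] so that mu = 0.5."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        xbar = sum(rng.random() for _ in range(n)) / n
        total += (n ** scale_pow * (xbar - 0.5)) ** 2
    return (total / trials) ** 0.5

for n in (100, 10000):
    # sqrt(n)-scaling: spread stays roughly constant (O_P(1)).
    # n^(1/3)-scaling: spread decays like n^(-1/6), i.e. toward 0.
    print(n, rms(0.5, n), rms(1 / 3, n))
```

This mirrors the algebra above: $n^{1/3}(\bar{X}-\mu) = n^{-1/6}\cdot\sqrt{n}(\bar{X}-\mu)$, a vanishing deterministic factor times an $O_P(1)$ term.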