How to show that $n^{-r} \sum_{j=1}^n (X_j - \mu) \rightarrow 0$ in probability


I need your help to prove the following statement.


Let $X_1, X_2, \dots$ be stochastically independent, identically distributed random variables. Assume they have a finite expected value $\mu$ and a finite variance $\sigma^2$. Show that the following equivalence holds:

$$ n^{-r} \sum_{j=1}^n (X_j - \mu) \ \longrightarrow \ 0 \ \text{in probability}\quad \iff \quad r>\frac{1}{2} $$


Research effort

We have to consider the expression $P\left(\left|\sum_{j=1}^n(X_j-\mu)\right| \geq \epsilon \cdot n^r \right)$. By the triangle inequality, $\left|\sum_{j=1}^n(X_j-\mu)\right| \leq \sum_{j=1}^n|X_j-\mu|$, and with some further observations it would be enough to prove that $$ P\left(|X_1-\mu| \geq \epsilon \cdot n^{r-1}\right)^n \rightarrow 0. $$ It appears to me that I do not have enough information about $X_1$'s behaviour to prove this. Could you please give me a hint?

Define $Y_n:=\frac 1{n^r}\sum_{j=1}^n(X_j-\mu)$. Since the $X_j$ are independent with common variance $\sigma^2$, we get $\mathbb E(Y_n^2)=\operatorname{Var}(Y_n)=\frac{n\sigma^2}{n^{2r}}=\frac{\sigma^2}{n^{2r-1}}$, which tends to $0$ when $r>1/2$; convergence in $L^2$ implies convergence in probability.
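Concretely, Chebyshev's inequality turns the second-moment bound into the required tail estimate: for every $\epsilon > 0$,

$$ P(|Y_n| \geq \epsilon) \ \leq \ \frac{\mathbb E(Y_n^2)}{\epsilon^2} \ = \ \frac{\sigma^2}{\epsilon^2 \, n^{2r-1}} \ \longrightarrow \ 0 \qquad \text{whenever } r > \tfrac{1}{2}. $$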

For $r=1/2$, the central limit theorem gives $Y_n \Rightarrow \mathcal N(0,\sigma^2)$ in distribution, so when $\sigma^2>0$ we can't have convergence in probability to $0$. For $r<1/2$, write $Y_n = n^{1/2-r}\cdot\frac 1{\sqrt n}\sum_{j=1}^n(X_j-\mu)$; the second factor converges in distribution to $\mathcal N(0,\sigma^2)$ while $n^{1/2-r}\to\infty$, so $P(|Y_n|\geq\epsilon)\to 1$ for every $\epsilon>0$ and convergence in probability fails here as well.
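A quick Monte Carlo check illustrates both regimes. This is a sketch, not part of the proof; the choice of Exponential($1$) variables, which makes $\mu=\sigma=1$, and the values of $\epsilon$, $n$, and the trial count are all illustrative assumptions.

```python
import random

def tail_prob(r, n, eps=0.5, trials=1000, seed=0):
    """Monte Carlo estimate of P(|n^{-r} * sum_{j=1}^n (X_j - mu)| >= eps)
    for X_j ~ Exponential(1), so that mu = sigma = 1 (an illustrative choice)."""
    rnd = random.Random(seed)
    hits = 0
    for _ in range(trials):
        # Sample S_n - n*mu = sum of centered exponentials.
        s = sum(rnd.expovariate(1.0) - 1.0 for _ in range(n))
        if abs(s) / n ** r >= eps:
            hits += 1
    return hits / trials

# For r = 0.6 > 1/2 the tail probability shrinks as n grows,
# while for r = 1/2 it stabilizes near P(|N(0,1)| >= 0.5), about 0.62.
print(tail_prob(0.6, 100), tail_prob(0.6, 4000))
print(tail_prob(0.5, 100), tail_prob(0.5, 4000))
```

With the seed fixed, the estimates for $r=0.6$ visibly decrease in $n$, whereas for $r=0.5$ they hover around the limiting normal tail probability, matching the dichotomy above.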