Showing convergence in probability


Suppose that $X_1,X_2,\dots$ are independent with $P(0\le X_j\le 1)=1$ for all $j$. Let $S_n=\sum_{j=1}^nX_j$ and $\mu_n=ES_n$. Show that if $\mu_n\to\infty$ as $n\to\infty$, then $S_n\overset{p}{\to}\infty$.

Does the following justification suffice to prove this? Convergence in expectation implies convergence in probability.

If not, can anyone help me fill in the details of this proof? Thanks in advance!

Best answer:

This result is an immediate consequence of Kolmogorov's 0-1 law and Kolmogorov's three-series theorem. The event $\{\omega: \sum X_i(\omega) \text{ converges}\}$ is a tail event, so its probability is $0$ or $1$. We show that it is $0$: if it were $1$, the three-series theorem would give $\sum EX_i<\infty$, which contradicts $\mu_n=\sum_{j=1}^n EX_j\to\infty$. Hence $\sum X_i$ diverges almost surely; since the $X_j$ are nonnegative, $S_n$ is nondecreasing, so $S_n\to\infty$ almost surely, and therefore in probability. [See Theorem 5.3.3 of *A Course in Probability Theory* by K. L. Chung, and take $A=1$ in that theorem; this is where the fact that $0\le X_i\le 1$ is used.]
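As an informal sanity check (not part of the proof), the conclusion can be illustrated numerically. The example below is my own choice, not from the question: take $X_j\sim\text{Bernoulli}(1/j)$, so $0\le X_j\le 1$ and $\mu_n=\sum_{j\le n}1/j\sim\log n\to\infty$. The theorem then predicts $P(S_n>M)\to 1$ for every fixed $M$, which a crude Monte Carlo estimate reflects:

```python
import random

random.seed(0)

def sample_Sn(n):
    # One realization of S_n = X_1 + ... + X_n with X_j ~ Bernoulli(1/j).
    return sum(1 for j in range(1, n + 1) if random.random() < 1.0 / j)

def prob_Sn_exceeds(n, M, trials=2000):
    # Monte Carlo estimate of P(S_n > M).
    return sum(sample_Sn(n) > M for _ in range(trials)) / trials

# The estimated P(S_n > 3) should climb toward 1 as n grows,
# even though mu_n grows only logarithmically here.
for n in (10, 100, 1000, 10000):
    print(n, prob_Sn_exceeds(n, M=3))
```

This only illustrates convergence in probability for one particular sequence of distributions; the proof above covers every sequence satisfying the hypotheses.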