Convergence in distribution implies convergence almost surely


Suppose we have a non-decreasing sequence of random variables that converges to 0 in distribution. How can we prove that this sequence converges to 0 almost surely?

I will use two standard theorems:

(1) Convergence in Distribution to a constant r.v. $\Rightarrow$ Convergence in Probability

(2) If $X_n$ converges in probability to $X$ then there exists a subsequence $X_{n_k}$ such that $X_{n_k}$ converges almost surely to $X$ as $k \rightarrow \infty$.
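Theorem (1) can be seen numerically. As a minimal sketch, assume the hypothetical example $X_n \sim \mathrm{Normal}(0, 1/n)$, which converges in distribution to the constant 0; the Monte Carlo estimate of $P(|X_n| > \varepsilon)$ then shrinks toward 0, which is exactly convergence in probability:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical example: X_n ~ Normal(0, 1/n) converges in distribution
# to the constant 0.  By theorem (1) it must also converge in
# probability, i.e. P(|X_n| > eps) -> 0.  We estimate that probability
# by Monte Carlo for increasing n.
eps = 0.1
for n in [1, 10, 100, 1000]:
    samples = rng.normal(scale=1 / np.sqrt(n), size=200_000)
    print(n, np.mean(np.abs(samples) > eps))  # empirical P(|X_n| > eps)
```

The estimated exceedance probability drops from roughly 0.9 at $n = 1$ to nearly 0 at $n = 1000$, consistent with convergence in probability to 0.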

For the given problem, for each fixed $w$ the sequence $X_n(w)$ is non-decreasing in $n$, so $\lim_{n\rightarrow \infty}X_n(w)$ exists in $(-\infty, \infty]$ for all $w$. By (1), $X_n \rightarrow 0$ in probability, and by (2) there is a subsequence $n_k$ such that $X_{n_k} \rightarrow 0$ almost surely as $k \rightarrow \infty$. Let $T = \{w : X_{n_k}(w) \rightarrow 0\}$, so $\mathrm{P}(T) = 1$. For every $w \in T$, the monotone sequence $X_n(w)$ has a limit, and a subsequence of it converges to 0, so the limit must be 0; hence $X_n(w) \rightarrow 0$ for all $w \in T$. Since $\mathrm{P}(T) = 1$, we are done.
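The whole argument can be sanity-checked on a concrete example. As a minimal sketch, assume the hypothetical sequence $X_n(w) = -U(w)/n$ with $U \sim \mathrm{Uniform}(0,1)$: for each fixed $w$ it is non-decreasing in $n$ and converges to 0, so it converges to 0 almost surely (and hence in distribution):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: X_n(w) = -U(w)/n with U ~ Uniform(0, 1).
# For each fixed w the sequence is non-decreasing in n and tends to 0,
# so X_n -> 0 almost surely, matching the conclusion of the proof.
U = rng.uniform(size=100_000)  # one draw of U per sample path w


def X(n, U=U):
    return -U / n


# Monotonicity in n holds pathwise (for every sampled w):
assert np.all(X(1) <= X(2)) and np.all(X(2) <= X(3))

# Almost-sure convergence shows up as the worst-path error vanishing:
for n in [1, 10, 100, 1000]:
    print(n, np.max(np.abs(X(n))))  # sup over sampled w of |X_n(w)|
```

The printed maximum is bounded by $1/n$, so every sampled path converges to 0, illustrating the almost-sure convergence the proof establishes.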