Convergence of random variables?


I was reading about notions of convergence of random variables and I am not sure if I understood them correctly. According to Wikipedia, for a sequence of random variables $X_1, X_2, \ldots$, almost sure convergence means $P(\lim_{n \rightarrow \infty} X_n = X) = 1$, which is the same as $P(\omega \in \Omega : \lim_{n \rightarrow \infty} X_n(\omega) = X(\omega)) = 1$.

Now, since random variables are functions from the sample space to $R$, it seems that almost sure convergence means we have pointwise convergence of the functions (the $X_n$'s to $X$), except possibly on a set of measure zero. I can see how this implies convergence in distribution, but what about the random variable $Y = \lim_{n \rightarrow \infty} X_n$ and $X$: can we say $Y = X$? I cannot see how this is true, because although they would be the same function (random variable), the underlying randomness (which could be different for $Y$ and $X$) could generate different $\omega$'s, which $Y$ and $X$ would then map to different values in $R$. And if $Y = X$ does not hold, how is almost sure convergence different from convergence in distribution?

Can we have some non-trivial example where the limiting random variable is not constant? I don't have a mathematics background, so please try to avoid measure-theoretic language, and sorry for the long and confusing post. Thanks a lot in advance.



BEST ANSWER

If $Y$ is another limit with probability $1$, then $X = Y$ with probability $1$; that is, $P(X \neq Y) = 0$. In other words, there is a subset of $\Omega$ whose probability is $0$, and if we omit this set then $X(\omega) = Y(\omega)$ for every $\omega$ (not in that set).
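As for the asker's request for a non-trivial limit: a minimal numerical sketch of almost sure convergence to a non-constant limit, using a toy example of my own (not from the answer below): on $\Omega = [0,1]$ with the uniform measure, $X_n(\omega) = \omega + 1/n$ converges pointwise, hence almost surely, to $X(\omega) = \omega$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample omega once; X_n(omega) = omega + 1/n converges pointwise
# to the non-constant limit X(omega) = omega.
omega = rng.uniform(0.0, 1.0, size=100_000)

for n in (1, 10, 100, 1000):
    X_n = omega + 1.0 / n
    # sup over sampled omega of |X_n(omega) - X(omega)|, which is 1/n
    max_gap = np.max(np.abs(X_n - omega))
    print(n, max_gap)
```

The gap shrinks to $0$ for every single $\omega$, which is exactly pointwise (and hence almost sure) convergence.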

As for the relationship between the two concepts of convergence, take a specific example over $\Omega = [0,1]$. Let the probability measure be uniform (defined by length). Consider $X(\omega) = \omega$; this will be the limiting random variable. Define the $X_n$'s as functions from $[0,1]$ to $R$ in the following "walking" way. First divide $[0,1]$ into four subintervals: $[0,1/4], (1/4,1/2], (1/2,3/4], (3/4,1]$. Let $X_1 = 0$ over $[0,1/4]$ and $\omega$ otherwise. Let $X_2 = 0$ over $(1/4,1/2]$ and $\omega$ otherwise, and so on. Then halve these intervals and define $X_5 = 0$ over $[0,1/8]$ and $\omega$ otherwise, letting the abyss "walk" through the $8$ intervals.
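The construction can be written out explicitly. The indexing scheme below is my own reading of the answer (stage $k$ uses $2^{k+1}$ equal intervals, so $X_1, \ldots, X_4$ form the first stage, $X_5, \ldots, X_{12}$ the second, and so on), matching the intervals given above:

```python
def X_n(n, omega):
    """Walking-abyss random variable on Omega = [0, 1].

    Stage k (k = 1, 2, ...) splits [0, 1] into 2**(k+1) equal
    intervals and contributes the next 2**(k+1) values of n.
    X_n is 0 on its interval (the "abyss") and omega elsewhere.
    """
    # Find the stage that n belongs to.
    k, offset = 1, 0
    while n > offset + 2 ** (k + 1):
        offset += 2 ** (k + 1)
        k += 1
    pieces = 2 ** (k + 1)            # number of intervals at this stage
    j = n - offset - 1               # 0-based position of the abyss
    lo, hi = j / pieces, (j + 1) / pieces
    # First interval is closed on the left, the rest are half-open.
    in_abyss = (lo <= omega <= hi) if j == 0 else (lo < omega <= hi)
    return 0.0 if in_abyss else omega
```

For example, `X_n(1, 0.1)` is `0.0` (the abyss $[0,1/4]$ covers $0.1$) while `X_n(1, 0.3)` is `0.3`, and `X_n(5, 0.1)` is `0.0` because $X_5$'s abyss is $[0,1/8]$.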

The figure below depicts $X_{10}$ and $X_{11}$, the $10$th and $11$th elements of this walking abyss.

[Figure: graphs of $X_{10}$ and $X_{11}$ over $[0,1]$]

Then halve the current intervals and let the next series walk through them in the same way; then halve again. Now you have narrower and narrower abysses walking through the interval $[0,1]$. The probability that $X_n = \omega$ gets larger and larger, because the interval over which $X_n = 0$ gets narrower and narrower. So this is convergence in distribution to $X(\omega) = \omega$ (in fact, even convergence in probability). Do we have convergence with probability $1$? No! For any $\omega$ there will always be a later $n$ at which the ever-slimmer walking abyss makes $X_n(\omega) = 0$, so $X_n(\omega)$ drops back to $0$ infinitely often and never settles down. What is more, $P(\lim X_n(\omega) = \omega) = 0$.
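Both claims can be checked numerically. The self-contained helper below uses the same stage-by-stage indexing as the construction (my own reading: stage $k$ has $2^{k+1}$ intervals): the abyss width shrinks to $0$, so $P(X_n \neq \omega) \to 0$, yet a fixed $\omega$ keeps falling into the abyss once per stage, forever.

```python
def abyss_interval(n):
    """Return (lo, hi, length): the interval on which X_n = 0."""
    k, offset = 1, 0
    while n > offset + 2 ** (k + 1):
        offset += 2 ** (k + 1)
        k += 1
    pieces = 2 ** (k + 1)
    j = n - offset - 1
    return j / pieces, (j + 1) / pieces, 1.0 / pieces

omega = 0.6          # any fixed sample point in (0, 1] works
hits = []            # values of n at which X_n(omega) = 0
for n in range(1, 200):
    lo, hi, length = abyss_interval(n)
    if lo < omega <= hi:          # omega falls into the abyss
        hits.append(n)

# The abyss returns once per stage, so the hits never stop...
print(hits)
# ...while its width (= P(X_n != omega)) keeps halving.
print(abyss_interval(hits[-1])[2])
```

Running this shows hits at ever-sparser $n$ but never a last one, which is exactly why $X_n(\omega)$ fails to converge for that $\omega$ even though $P(X_n \neq \omega) \to 0$.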