Sure vs almost sure convergence for a simple random variable


I thought I totally got it until I faced a simple problem and realized I'm getting a contradiction. Consider a sequence of simple random variables on $([0,1], \mathcal{F}, \mu)$ with $\mu$ Lebesgue measure: $$ X_n(\omega) = \begin{cases} 1 & \text{if } \omega \in (0, \frac{1}{n}) \\ 0 & \text{otherwise} \end{cases} $$ The limit is $X(\omega) = 0$ for all $\omega \in \Omega$. If I fix $\omega > 0$, then $|X_n(\omega) - X(\omega)| = 0$ for all $n \geq \frac{1}{\omega}$ (and $X_n(0) = 0$ for every $n$), so $\lim_n X_n(\omega) = X(\omega)$ for all $\omega \in \Omega$, i.e. sure convergence. At the same time, checking convergence almost surely, I find $$ \mu(\{\omega : X_n(\omega) \neq X(\omega)\}) = \frac{1}{n} \Rightarrow \sum_n \mu(X_n \neq X) = \sum_n \frac{1}{n} = \infty. $$
Therefore I conclude $X_n \not\to X$ a.s., because $X_n = 1$ infinitely often with probability $1$. (It is easy to see that $X_n \to X$ in probability, since $\mu(X_n \neq X) = \frac{1}{n} \to 0$.)
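The pointwise (sure) convergence claimed above is easy to check numerically. A minimal sketch, where the function `X` and the sampled values of $\omega$ are my own choices for illustration:

```python
from fractions import Fraction

def X(n, omega):
    """X_n(omega) = 1 if omega lies in (0, 1/n), else 0."""
    return 1 if 0 < omega < Fraction(1, n) else 0

# For any fixed omega > 0, X_n(omega) = 0 once n >= 1/omega,
# so the sequence is eventually constant at 0 for that omega.
for omega in [Fraction(1, 2), Fraction(1, 10), Fraction(3, 7)]:
    N = int(1 / omega) + 1                      # beyond this index the terms vanish
    tail = [X(n, omega) for n in range(N, N + 100)]
    assert all(t == 0 for t in tail)            # the tail is identically zero
```

Fixing $\omega$ first and then letting $n \to \infty$ is exactly what sure convergence asks for, and the loop above confirms it for each sampled $\omega$.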

So I'd be grateful if someone could point out the flaw in my logic.

BEST ANSWER

This is irrelevant: $$ \mu(\{\omega : X_n(\omega) \neq X(\omega)\}) = \frac{1}{n} \Rightarrow \sum_n \mu(X_n \neq X) = \sum_n \frac{1}{n} = \infty $$ From this you cannot conclude that $X_n(\omega) \neq X(\omega)$ i.o.

Perhaps you are trying to use the Borel–Cantelli lemma, but the direction you need (divergent sum $\Rightarrow$ the events occur i.o. almost surely) is the second Borel–Cantelli lemma, and it requires the events $\{X_n \neq X\}$ to be independent, which here they are not.
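In this example one can even compute the relevant limsup directly and see that there is no contradiction. Writing $A_n$ for the exception sets (a shorthand introduced here), the events are nested, so

$$ A_n := \{X_n \neq X\} = \left(0, \tfrac{1}{n}\right), \qquad A_1 \supseteq A_2 \supseteq \cdots, $$

$$ \{X_n \neq X \text{ i.o.}\} = \limsup_n A_n = \bigcap_{n \geq 1} \bigcup_{m \geq n} A_m = \bigcap_{n \geq 1} \left(0, \tfrac{1}{n}\right) = \varnothing. $$

Hence $\mu(X_n \neq X \text{ i.o.}) = 0$, and $X_n \to X$ almost surely (indeed surely), even though $\sum_n \mu(A_n) = \infty$.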