Correct reasoning? About an iid random variable series


The statement I want to prove is the following: given a sequence $\lbrace X_n \rbrace_n$ of independent identically distributed random variables that are non-degenerate at $0$, that is, none of them is almost surely $0$, then

$P(\sum_{i=1}^n X_i \text{ converges}) = 0$.


I have tried the following:

Let $(\Omega, \mathcal{F}, P)$ be the probability space on which the random variables are defined.

$\{\sum_{i=1}^n X_i \text{ converges}\}$ is a tail event with respect to the independent random variables $\lbrace X_n \rbrace_n$, so by Kolmogorov's $0$-$1$ law we have $P(\sum_{i=1}^n X_i \text{ converges}) \in \{0,1\}$.

Notice that $\{\sum_{i=1}^n X_i \text{ does not converge}\} = \{\sum_{i=1}^n X_i \text{ converges}\}^c$.

As the variables are identically distributed, we have $X_i(\omega) = X_1(\omega)$ for all $\omega \in \Omega$. Thus $\sum_{i=1}^n X_i(\omega) = \sum_{i=1}^n X_1(\omega) = n\cdot X_1(\omega)$. (I don't know if this is correct.)

Now, as the random variables are non-degenerate at $0$, there should exist an element $\omega_0 \in \Omega$ such that $P(\{\omega_0\}) > 0$, with $X_1(\omega_0) = k \neq 0$, $k \in \mathbb{R}$.

Therefore we have that $\sum_{i=1}^n X_i(\omega_0) = \sum_{i=1}^n X_1(\omega_0) = n\cdot X_1(\omega_0)$, which diverges as $n \longrightarrow \infty$.

Then $\omega_0 \in \{\sum_{i=1}^n X_i \text{ does not converge}\}$ and $P(\{\sum_{i=1}^n X_i \text{ does not converge}\}) \geq P(\{\omega_0\}) > 0$; but by Kolmogorov's $0$-$1$ law that probability must be $1$, and then $P(\sum_{i=1}^n X_i \text{ converges}) = 0$.


On BEST ANSWER

No, the variables being identically distributed doesn't mean $X_i(\omega) = X_1(\omega)$.
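A concrete illustration of this point (my own example, not part of the original answer): two independent fair coin flips have the same distribution, yet they disagree with probability $1/2$, so identical distribution says nothing about pointwise equality.

```python
import random

random.seed(0)

# X1, X2: independent fair coin flips, both uniform on {-1, +1}.
# They are identically distributed, yet X2(w) != X1(w) on a set of
# probability 1/2.
trials = 100_000
disagree = sum(
    1 for _ in range(trials)
    if random.choice([-1, 1]) != random.choice([-1, 1])
)
frac = disagree / trials
print(frac)  # close to 0.5
```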

However, we don't need any sophisticated tools to show that the series diverges a.s.

As $X_1$ is not almost surely $0$, there is some $\epsilon > 0$ with $P(|X_i| > \epsilon) > \epsilon$. Indeed, $P(|X_1| > \delta) = p > 0$ for some $\delta > 0$; taking $\epsilon = \min(\delta, p)/2$ gives $P(|X_1| > \epsilon) \geq P(|X_1| > \delta) = p > \epsilon$.
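As a quick numerical check of this claim (my own, not from the original answer): for a standard normal $X_1$ one can take $\epsilon = 0.5$, since $P(|X_1| > 0.5) \approx 0.617 > 0.5$.

```python
import math

# Tail probability of a standard normal: P(|X| > t) = 1 - erf(t / sqrt(2)).
def normal_tail(t):
    return 1.0 - math.erf(t / math.sqrt(2.0))

eps = 0.5
p = normal_tail(eps)
print(p)  # about 0.617, which indeed exceeds eps = 0.5
```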

Recall that if the series converges, then for some $n$ all terms after the $n$-th are less than $\epsilon$ in absolute value (the terms of a convergent series must tend to $0$). So $$P(\text{series converges}) \leq P(\exists n\ \forall i > n: |X_i| < \epsilon) \leq \sum_n P(\forall i > n: |X_i| < \epsilon)$$ by the union bound.

$$P(\forall i > n: |X_i| < \epsilon) \leq\\ P(|X_{n + 1}| < \epsilon, |X_{n + 2}| < \epsilon, \ldots, |X_{n + N}| < \epsilon) =\\ P(|X_1| < \epsilon)^N <\\ (1 - \epsilon)^N \xrightarrow{N \to \infty} 0$$

where the middle equality uses independence and identical distribution, and the last inequality uses $P(|X_1| < \epsilon) \leq 1 - P(|X_1| > \epsilon) < 1 - \epsilon$.

So $P(\forall i > n: |X_i| < \epsilon) = 0$ for every $n$, hence $\sum_n P(\forall i > n: |X_i| < \epsilon) = 0$ too, and so the probability that the series converges is zero as well.
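As a sanity check of the geometric decay used above, here is a small Monte Carlo sketch (my own, with standard normal variables assumed for concreteness): the estimated probability that the first $N$ variables all fall inside $(-\epsilon, \epsilon)$ tracks the product $P(|X_1| < \epsilon)^N$ and shrinks toward $0$ as $N$ grows.

```python
import math
import random

random.seed(42)
eps = 0.5
# P(|X_1| < eps) for X_1 ~ N(0, 1); about 0.383, strictly below 1.
p_single = math.erf(eps / math.sqrt(2.0))

# Estimate P(|X_1| < eps, ..., |X_N| < eps) by Monte Carlo and compare
# with the product formula p_single ** N given by independence.
trials = 20_000
ests = {}
for N in (1, 5, 10):
    hits = sum(
        1
        for _ in range(trials)
        if all(abs(random.gauss(0.0, 1.0)) < eps for _ in range(N))
    )
    ests[N] = hits / trials
    print(N, ests[N], p_single ** N)
```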