Suppose we choose a random sequence of signs $a_n \in \{ -1, 1 \}$, with the distribution being according to uniform Lebesgue measure on $\{ -1, 1 \}^\omega \simeq [0, 1]$. I wonder if it's true or false that almost certainly, $\sum_{n=1}^\infty \frac{a_n}{n}$ converges (with the series obviously being either conditionally convergent or divergent in any case).
I'm leaning towards true, with a random sequence almost surely having roughly equal numbers of positive and negative entries as $n \to \infty$. On the other hand, it wouldn't shock me too much to find out the statement is false, or even that the series is almost surely divergent.
(Note this isn't a homework problem or anything, the problem just occurred to me.)
See Kolmogorov's three-series theorem: if $X_1, X_2, \dots$ are independent random variables and, for some $A > 0$ with $Y_n = X_n \mathbf{1}_{\{|X_n| \le A\}}$, the three series (i) $\sum_n \mathbb{P}(|X_n| \ge A)$, (ii) $\sum_n \mathbb{E}(Y_n)$, and (iii) $\sum_n \mathrm{var}(Y_n)$ all converge, then $\sum_n X_n$ converges almost surely.
Now let's apply this to the example at hand: $X_n = \frac{a_n}{n}$, where the $a_n$ are i.i.d. random variables taking the values $+1$ and $-1$ each with probability $1/2$.
Take $A = 2$. Since $|X_n| = 1/n \le 1 < 2$, the event $|X_n| \ge A$ has probability $0$ for every $n$, so (i) holds trivially.
Next, $Y_n = X_n$ (since $|X_n| \le A$ always), and $\mathbb{E}(Y_n) = 0$ for every $n$, so (ii) holds trivially.
Finally, $\mathrm{var}(Y_n) = 1/n^2$, and $\sum\frac{1}{n^2}$ converges. So (iii) is true.
Therefore, the series $\sum X_n$ converges a.s.
[In fact, this precise example is used to illustrate the three-series theorem on the Wikipedia page for the theorem.]
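You can also see the a.s. convergence numerically. Here is a minimal Monte Carlo sketch (the function name `partial_sums` and the choice of seeds are my own, just for illustration): for a few independent sign sequences, the partial sums $S_N = \sum_{n=1}^{N} \frac{a_n}{n}$ barely move between $N = 10^5$ and $N = 2 \cdot 10^5$, consistent with convergence.

```python
import random

def partial_sums(n_terms, seed=0):
    """Partial sums S_N of sum_{n=1}^{N} a_n / n for a random sign sequence."""
    rng = random.Random(seed)
    s = 0.0
    sums = []
    for n in range(1, n_terms + 1):
        a = rng.choice((-1, 1))  # a_n = +1 or -1, each with probability 1/2
        s += a / n
        sums.append(s)
    return sums

# For several independent sign sequences, compare S_{100000} and S_{200000}:
for seed in range(3):
    sums = partial_sums(200_000, seed=seed)
    print(f"seed={seed}: S_1e5 = {sums[99_999]:+.4f}, S_2e5 = {sums[-1]:+.4f}")
```

This matches the theorem's accounting: the tail $\sum_{n > N} 1/n^2 \approx 1/N$ bounds the variance of the remaining fluctuations, so after $10^5$ terms the partial sum typically drifts by only a few thousandths.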