(Not homework. Thought it was an interesting question and wanted to spark discussion.)
In what follows, $\mathbb{N} = \mathbb{Z}\:\cap\:[1,\infty)$.
For $n\in\mathbb{N}$, let
- $I_n = (-\frac{1}{n}, \frac{1}{n})$,
- $X_n\sim \mathrm{Uniform}(I_n)$, a sequence of pairwise independent random variables, and
- $X = \displaystyle\sum_{n\in\mathbb{N}} X_n$.
1. Find, with proof, $\mathbb{P}(X < \infty)$.
2. If $\mathbb{P}(X < \infty) < 1$ (resp. $= 1$), then find a sequence of absolutely continuous random variables $Y_n$ on $I_n$ such that $\mathbb{P}\Big(\displaystyle\sum_{n\in\mathbb{N}} Y_n < \infty \Big) = 1$ (resp. $<1$), or prove that no such sequence exists.
Partial solution (thanks to Daniel Schepler): The answer to (1) is $\mathbb{P}(X < \infty) = 1$. Indeed, Kolmogorov's two-series theorem says that a series $\sum_n X_n$ of independent random variables, with $\mathbb{E}(X_n) = \mu_n$ and $\operatorname{Var}(X_n) = \sigma^2_n$, converges a.s. if both $\sum_n \mu_n$ and $\sum_n \sigma_n^2$ converge. Here $\mu_n \equiv 0$ and $\sigma_n^2 = \frac{(2/n)^2}{12} = \frac{1}{3n^2}$, so both series converge and therefore $\sum_n X_n$ converges almost surely. Note, however, that the classical theorem assumes mutual independence, whereas the $X_n$ above are only assumed pairwise independent; the next paragraph addresses this gap.
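As a quick numerical sanity check (an illustration, not part of the proof; the truncation level $N$ and variable names below are my own choices), one can sum the variance series directly and simulate a few sample paths of the partial sums:

```python
import math
import random

# Variance series: sum of Var(X_n) = 1/(3 n^2), which converges to pi^2/18.
N = 100_000
var_sum = sum(1.0 / (3 * n * n) for n in range(1, N + 1))
print(var_sum, math.pi ** 2 / 18)  # partial sum vs. the full series' limit

# Simulate partial sums S_N = sum_{n <= N} X_n with X_n ~ Uniform(-1/n, 1/n);
# each path settles near a finite value, consistent with a.s. convergence.
random.seed(0)
for _ in range(3):
    s = sum(random.uniform(-1.0 / n, 1.0 / n) for n in range(1, N + 1))
    print(round(s, 3))
```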
Kolmogorov's three-series theorem for pairwise NQD (negatively quadrant dependent) random variables has seemingly been proven here, as claimed here and here; this covers pairwise independent random variables as a special case. The result then follows by choosing $A=2$: since $|X_n| < \frac1n \le 1 < 2$, all three series trivialize. This means that $X$ is well-defined, i.e. the limit exists, and $|X|<\infty$ almost surely.
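Spelled out (the truncation notation $X_n^{(A)}$ is mine): write $X_n^{(A)}=X_n\mathbf 1_{\{|X_n|\le A\}}$. With $A=2$ we have $X_n^{(2)}=X_n$, because $|X_n|<\frac1n\le 1<2$, so the three series of the theorem become
$$\sum_{n}\mathbb P(|X_n|>2)=0,\qquad \sum_{n}\mathbb E\big(X_n^{(2)}\big)=\sum_n 0=0,\qquad \sum_{n}\operatorname{Var}\big(X_n^{(2)}\big)=\sum_n\frac{1}{3n^2}<\infty,$$
and all three converge.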
For the second item, we clearly have to violate an assumption of the theorem.

If we are free to choose a different distribution: take $X_n$ independent and absolutely continuous with a pdf supported on $(-1/n,1/n)$ such that $\mathbb P(X_n\le 1/(2n))\le\varepsilon^{n}$ for some fixed $\varepsilon\in(0,1)$. By the union bound and the geometric series, the probability that there exists a $k$ with $X_k\le 1/(2k)$ is at most $\sum_n\varepsilon^n=\frac{\varepsilon}{1-\varepsilon}$; on the complementary event, $\sum_n X_n\ge\sum_n\frac{1}{2n}=\infty$. So the series diverges with probability at least $1-\frac{\varepsilon}{1-\varepsilon}$, which is close to $1$ for small $\varepsilon$ (we could even get exactly $1$). Notice that the variance of each $X_n$ is at most $1/n^2$ (since $|X_n|<1/n$) and non-negative, so the variance series necessarily converges; likewise the tail series $\sum_n\mathbb P(|X_n|>A)$ converges for any $A>0$ because its terms are eventually $0$. Hence such an example has to violate NQD or the expectation series — here it is the expectation series, since $\mu_n\ge\frac{1-\varepsilon^n}{2n}-\frac{\varepsilon^n}{n}$ and therefore $\sum_n\mu_n=\infty$.

For an example that violates independence: flip a fair coin and choose the sequence $(X_n)_n=(1/n)_n$ on heads and $(X_n)_n=(-1/n)_n$ on tails (to stay strictly inside the open interval $I_n$, scale everything by a fixed $c\in(0,1)$). Then every $X_n$ has expectation $0$ and variance $1/n^2$, but the series diverges almost surely because it diverges for both outcomes. Proceeding as above, we achieve a similar result with absolutely continuous marginal distributions: choose your favorite pdf on $(-1,1)$ for $X_1$ and set $X_n=X_1/n$, so that each $X_n$ is absolutely continuous with support in $I_n$, while $\sum_n X_n=X_1\sum_n\frac{1}{n}$ diverges almost surely (as $X_1\neq 0$ almost surely).
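A minimal sketch of the dependent construction (variable names and the seed are my own choices): with $X_1$ uniform on $(-1,1)$ and $X_n=X_1/n$, the partial sums are $X_1$ times a harmonic sum, so they grow like $X_1\ln N$:

```python
import math
import random

random.seed(1)
x1 = random.uniform(-1.0, 1.0)   # X_1: absolutely continuous on (-1, 1)
N = 1_000_000
harmonic = sum(1.0 / n for n in range(1, N + 1))  # ~ ln(N) + gamma
partial_sum = x1 * harmonic      # sum of X_n = X_1 / n for n <= N
print(x1, partial_sum)           # |partial_sum| keeps growing with N
```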
For a concrete family of pdfs, connect $(-1/n,0)$, $(-1/n,2n\varepsilon^n/3)$, $(1/(2n),2n\varepsilon^n/3)$, $(1/(2n),0)$, $(3/(4n),4n(1-\varepsilon^n))$ and $(1/n,0)$ with line segments. The rectangle on $[-1/n,1/(2n)]$ has area $\varepsilon^n$ and the triangle on $[1/(2n),1/n]$ has area $1-\varepsilon^n$, so the density integrates to $1$ and satisfies $\mathbb P\big(X_n\le 1/(2n)\big)=\varepsilon^n$. This gives a pdf with only two discontinuities; we could also construct a smooth one.
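As a sanity check of this construction (function names and the midpoint-rule integrator below are my own), the following sketch evaluates the piecewise-linear pdf and verifies numerically that it integrates to $1$ and that the mass left of $1/(2n)$ is $\varepsilon^n$:

```python
def pdf(x, n, eps):
    """Piecewise-linear density on (-1/n, 1/n): constant 2*n*eps**n/3 on
    [-1/n, 1/(2n)), then a triangle on [1/(2n), 1/n] with peak
    (3/(4n), 4*n*(1 - eps**n))."""
    lo, mid, peak, hi = -1.0 / n, 1.0 / (2 * n), 3.0 / (4 * n), 1.0 / n
    h = 4 * n * (1 - eps ** n)
    if lo <= x < mid:
        return 2 * n * eps ** n / 3
    if mid <= x <= peak:
        return h * (x - mid) / (peak - mid)
    if peak < x <= hi:
        return h * (hi - x) / (hi - peak)
    return 0.0

def integrate(f, a, b, steps=100_000):
    # simple midpoint rule; accurate enough for a piecewise-linear integrand
    dx = (b - a) / steps
    return sum(f(a + (i + 0.5) * dx) for i in range(steps)) * dx

n, eps = 3, 0.5
total = integrate(lambda x: pdf(x, n, eps), -1.0 / n, 1.0 / n)
left_tail = integrate(lambda x: pdf(x, n, eps), -1.0 / n, 1.0 / (2 * n))
print(total, left_tail, eps ** n)  # total ≈ 1, left_tail ≈ eps^n
```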