I want to prove the following: if $(X_n)$ is a sequence of independent, non-negative random variables and $S_n = \sum_{i = 1}^{n} X_i$, then $S_n \to S$ in probability implies $S_n \to S$ almost surely.
Since the $(X_n)$ are independent, Kolmogorov's 0-1 law implies that the event $B$ on which $S_n$ diverges is trivial: either $P(B)=1$ or $P(B)=0$.
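To make the 0-1 law step explicit, here is a sketch (in my own notation) of why $B$ is a tail event, using only the monotonicity of $S_n$ that comes from $X_i \geq 0$:

```latex
% Since X_i >= 0, the partial sums S_n are nondecreasing,
% so S_n converges if and only if it stays bounded. Hence
B \;=\; \Big\{ \sup_{n} S_n = \infty \Big\}
  \;=\; \Big\{ \sum_{i=1}^{\infty} X_i = \infty \Big\}
  \;=\; \Big\{ \sum_{i=m}^{\infty} X_i = \infty \Big\}
  \quad \text{for every } m \geq 1,
% because dropping the first m-1 (finite) terms does not
% change whether the series is infinite. Therefore
B \in \bigcap_{m \geq 1} \sigma(X_m, X_{m+1}, \dots),
% i.e. B lies in the tail sigma-algebra, and Kolmogorov's
% 0-1 law gives P(B) \in \{0, 1\}.
```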
I want to show that $P(B)=1$ contradicts the convergence in probability, but I don't know how to do it. Because the $(X_n)$ are non-negative, $B$ is the event on which $S_n$ is unbounded, but I don't know how to define an $\varepsilon_B$ such that $\lim_{n \rightarrow \infty} P( |S_n - S | \geq \varepsilon_B ) = 1$. It is true that $S_n$ diverges on $B$, but I can't control $|S_n - S|$ because it depends implicitly on the event in $B$. If anyone can give me a hint, thank you for your help.