Let $(X_n)$ be a sequence of i.i.d. random variables such that $P(X_1 \geq 0)=1$ and $P(X_1 >0) > 0$. Show that $\sum_{n \geq 1}X_n =\infty$ almost surely.
We have that $P(X_1 > 0)= \underset{n \rightarrow \infty}{\lim}P(X_1 > \frac{1}{n})$, by continuity of the probability measure from below. So there exists a sufficiently big $N \in \mathbb{N}$ such that $\forall n \geq N$ we have $P(X_1 > \frac{1}{n})>0$. Let us then put $A_n = \{X_n \geq \frac{1}{N} \}$. We then have:
$$\sum_{n \geq 1} X_n \geq \sum_{n \geq 1} P(A_n) \geq M + \sum_{n\geq N} P(X_1> \tfrac{1}{N}) = M + P(X_1>\tfrac{1}{N})\sum_{n\geq N}1$$ (where $M$ is the sum of the first $N-1$ terms), which is infinite. Is my reasoning correct?
I am very unsure, as $\sum_{n\geq 1}X_n$ is ambiguous to me; I do not know what exactly it represents.
$\sum_n X_n$ is another random variable, just like $X_1, X_2$, etc.; but there is a relevant detail.
For every element $\omega$ in the sample space $\Omega$ (that is, for every possible result of the random experiment underlying the whole situation and calculations), each $X_n$ takes some real value $X_n(\omega)$ —remember that the random variables $X_n$ are by definition functions $$X_n\colon \Omega \longrightarrow \mathbb R$$ (for each $n\in\mathbb N$)— so for each $\omega$, the sum $\sum_n X_n$ also takes a value, namely $$\sum_n X_n(\omega)=X_1(\omega)+\cdots+X_n(\omega)+\cdots.$$
BUT: it could be the case that this series is divergent, and in that case the value of $\sum_n X_n (\omega)$ would be $\infty$. Anyway, it's just a matter of saying that this is a r.v. which takes values not in $\mathbb R$ but in $$\overline{\mathbb R}=\mathbb R \cup \{\infty\}.\quad(*)$$
Now, to say that an event $A$ happens almost surely is just to say that $P(A)=1$. So you are just asked to show that $$P\left(\sum_n X_n = \infty\right)=1$$ that is $$P(A)=1,$$ $A$ being $$A=\{\omega \in \Omega \colon \sum_n X_n(\omega) \text{ diverges}\}.$$
And you can get more and more specific, as in $$A=\{\omega \in \Omega \colon \forall M>0\ \exists N\in \mathbb N\ (n\ge N \implies X_1(\omega)+\cdots+X_n(\omega) \ge M)\},$$ and so on.
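That quantifier description of $A$ can be explored numerically. Below is a small Monte Carlo illustration (not a proof, of course): it uses a hypothetical example distribution with $P(X=1)=0.1$ and $P(X=0)=0.9$, which satisfies the hypotheses $P(X\ge 0)=1$ and $P(X>0)>0$, and checks that the partial sums on every simulated path eventually exceed a fixed threshold $M$.

```python
import random

# Monte Carlo illustration (not a proof) of the event A above, for an
# assumed example distribution with P(X = 1) = 0.1 and P(X = 0) = 0.9,
# so that P(X >= 0) = 1 and P(X > 0) > 0 as in the hypotheses.
random.seed(0)

def terms_needed(threshold, max_terms=100_000):
    """Number of terms until the partial sum X_1 + ... + X_n exceeds
    `threshold`, or None if that doesn't happen within `max_terms`."""
    total = 0.0
    for n in range(1, max_terms + 1):
        x = 1.0 if random.random() < 0.1 else 0.0  # one draw of X_n
        total += x
        if total > threshold:
            return n
    return None

# 200 independent paths, threshold M = 50: every path crosses it.
results = [terms_needed(50) for _ in range(200)]
print(all(r is not None for r in results))
```

With this seed every path crosses the threshold long before `max_terms` (the expected crossing time here is around $510$ terms); this only illustrates the almost-sure statement, it does not prove it.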
This is just theory and an explanation of the concepts involved; but since your hypotheses are not given in terms of 'omegas', and since you actually don't know much about the probability measure on the sample space, you don't need to make those $\omega$ explicit. You don't even need to write everything as probabilities: you can work with $X_n$ and $\sum_n X_n$ as mere numbers, as you did, as long as you mention that your conclusion holds 'a.s.', that is, with probability one.
So in your proof, if you are sure that $\sum_n X_n$ is greater than the last expression almost surely, and that this last expression diverges, then you are done.
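For what it's worth, one standard way to make that divergence step rigorous (a sketch; your argument doesn't mention it, so this is just one possible route) is the second Borel–Cantelli lemma. Since the $X_n$ are i.i.d. and $p := P(X_1 > \frac{1}{N}) > 0$,
$$\sum_{n\geq 1} P\left(X_n > \tfrac{1}{N}\right)=\sum_{n\geq 1} p=\infty,$$
and the events $\{X_n > \frac{1}{N}\}$ are independent, so the lemma gives $P(X_n > \frac{1}{N} \text{ for infinitely many } n)=1$. On that event,
$$\sum_{n\geq 1} X_n \geq \tfrac{1}{N}\cdot \#\left\{n \colon X_n > \tfrac{1}{N}\right\}=\infty,$$
which is exactly the almost-sure divergence you want.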
(*) The only drawback of this approach is that instead of the Borel sets of $\mathbb R$ we have to consider a $\sigma$-algebra over $\overline{\mathbb R}$ to determine for which sets $B$ the expression $P(\sum_n X_n \in B)$ makes any sense. But the Borel sets of $\overline{\mathbb R}$ form a well-defined family, so there's no such problem.
If you still wanted to avoid this approach, you would have to write the event $\{\sum_n X_n = \infty\}$ in terms of simpler events which do not involve the idea of taking the value $\infty$.
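For instance, one such decomposition (using that the $X_n$ are almost surely nonnegative, so the partial sums are nondecreasing) is
$$\left\{\sum_n X_n = \infty\right\}=\bigcap_{M\in\mathbb N}\ \bigcup_{k\in\mathbb N}\left\{X_1+\cdots+X_k > M\right\},$$
where each event on the right involves only a finite partial sum, which is an ordinary real-valued random variable, so the value $\infty$ never appears.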