Consider a sequence of independent and identically distributed random variables $X_i$ with $P(X_i > 1) > 0$ and $E(X_i) = 0$.
Then $M_n = \sum_{i=1}^n X_i$ is a martingale (with respect to the natural filtration).
I would like to prove that $M_n\geq 0$ infinitely often with probability $1$.
I thought about using upcrossing inequalities, but it seems that they only bound the number of upcrossings from above.
A second idea was to use a related result, namely that any nonnegative martingale converges almost surely, and to deduce the claim by contradiction.
The argument would go as follows:
Let $A = \big[\exists\, N,\; M_n < 0 \text{ for all } n \geq N\big]$. For $\omega \in A$, the sequence $M_n(\omega)$ converges; but since $P(X_i > 1) > 0$, $M_n$ does not converge with probability $1$. Therefore $P(A) = 0$.
Is this argument correct? Is there another one that doesn't require Doob's upcrossing inequalities?
No, your reasoning does not work. The martingale convergence theorem requires $M_n(\omega) \geq 0$ for all $\omega \in \Omega$ (and not just $M_n(\omega) \geq 0$ for $\omega$ in some subset of $\Omega$, such as $A$). To fix this gap in your reasoning you have to show that
$$\mathbb{P} \left( \limsup_{n \to \infty} M_n < 0 \right) \in \{0,1\}. \tag{1}$$
For this you can use the Hewitt-Savage 0-1 law.
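To spell out why the Hewitt-Savage 0-1 law applies (a step used implicitly here): the event in $(1)$ is exchangeable. If $\pi$ is a permutation of $\mathbb{N}$ with $\pi(i) = i$ for all $i > k$, then for every $n \geq k$
$$\sum_{i=1}^{n} X_{\pi(i)} = \sum_{i=1}^{n} X_i,$$
so the permuted walk agrees with $M_n$ for all but finitely many $n$ and hence has the same $\limsup$. Thus $\big\{\limsup_{n \to \infty} M_n < 0\big\}$ is invariant under finite permutations of $(X_i)_{i \geq 1}$, and the 0-1 law yields $(1)$.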
Alternative proof (using $(1)$): Set
$$Z(\omega) := \limsup_{n \to \infty} M_n(\omega),\qquad \omega \in \Omega.$$
Suppose that $\mathbb{P}(Z < 0) > 0$; then it follows from $(1)$ that $\mathbb{P}(Z < 0) = 1$. Thus,
$$\mathbb{E}(Z)<0. \tag{2}$$
On the other hand, the (reverse) Fatou lemma gives
$$\mathbb{E}(Z) = \mathbb{E} \left( \limsup_{n \to \infty} M_n \right) \geq \limsup_{n \to \infty} \mathbb{E}(M_n) = 0.$$
This contradicts $(2)$, and therefore we conclude $\mathbb{P}(Z < 0) = 0$.
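As a sanity check (not part of the proof), here is a small simulation. The increment distribution $P(X_i = 2) = 1/3$, $P(X_i = -1) = 2/3$ is an illustrative choice of mine, not from the question; it satisfies $E(X_i) = 0$ and $P(X_i > 1) > 0$. The count of times $M_n \geq 0$ keeps growing as the walk runs longer, consistent with $M_n \geq 0$ happening infinitely often.

```python
import random

def count_nonneg(n_steps: int, seed: int = 0) -> int:
    """Count the times n <= n_steps at which the walk M_n is >= 0.

    Increments: P(X = 2) = 1/3, P(X = -1) = 2/3, so E[X] = 0
    and P(X > 1) > 0 (an illustrative choice, not from the post).
    """
    rng = random.Random(seed)
    m, count = 0, 0
    for _ in range(n_steps):
        m += 2 if rng.random() < 1 / 3 else -1
        if m >= 0:
            count += 1
    return count

# Same seed => same increment sequence, so a longer run can only
# accumulate more nonnegative times than its own prefix.
print([count_nonneg(n) for n in (1_000, 10_000, 100_000)])
```

With a fixed seed the longer horizons extend the same trajectory, so the counts are nondecreasing in `n_steps`; across seeds, the count is positive in virtually every run, as the theorem predicts.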