I'm curious about what I believe to be a necessary condition for the sequence of partial sums $S_n = X_1 + \cdots + X_n$ of i.i.d. random variables $X_i$ to diverge almost surely to $+\infty$. Intuitively it seems obvious that $EX_i \neq 0$ is necessary (whenever the mean exists), but I cannot prove it.
Some approaches I'm sure won't work: the SLLN (it can't be applied, since it could be that $E|X_i| = +\infty$), and I don't think Kolmogorov's three-series theorem is of any use here; neither, clearly, are the monotone or dominated convergence theorems. Any other ideas? Or is there a counterexample to the statement?
You can prove this using Wald's equation, which is proved just below this exercise in Durrett's book.
Suppose that $EX_i=0.$ Following Durrett's notation, let $\alpha=\inf\{n\ge 1: S_n>0\}$ be the first time the walk is strictly positive, and $\bar\beta=\inf\{n\ge 1: S_n\le 0\}$ the first time it is nonpositive. If $E\alpha$ were finite, then Wald's equation would give $0<ES_\alpha=EX_i\, E\alpha=0,$ where the strict inequality holds because $S_\alpha>0$ by the definition of $\alpha.$ This contradiction shows that $E\alpha=\infty,$ and so $P(\bar\beta=\infty)=0$: the random walk cannot stay strictly positive at all times. Finally, restart the walk at time $\bar\beta.$ The increments after $\bar\beta$ are again i.i.d. with mean zero, so the restarted walk also hits $(-\infty,0]$ almost surely, at which point $S_n\le S_{\bar\beta}\le 0.$ Iterating, $S_n\le 0$ infinitely often a.s., which rules out $S_n\to+\infty.$
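Not a substitute for the proof, but here is a quick sanity check of the key claim: for a mean-zero walk, $P(\bar\beta=\infty)=0$, i.e. the walk keeps returning to $(-\infty,0]$. The sketch below simulates simple symmetric $\pm 1$ steps (my choice of a mean-zero distribution; the helper name `returns_to_nonpositive` is made up for the illustration) and counts visits to the nonpositive half-line.

```python
import random

random.seed(0)

def returns_to_nonpositive(n_steps=100_000):
    """Simulate S_n with X_i = +/-1 (so EX_i = 0) and count the
    number of times n <= n_steps at which S_n <= 0."""
    s, visits = 0, 0
    for _ in range(n_steps):
        s += random.choice((-1, 1))
        if s <= 0:
            visits += 1
    return visits

# Independent runs: empirically, essentially every mean-zero walk
# revisits (-inf, 0] many times, matching P(beta-bar = infinity) = 0.
counts = [returns_to_nonpositive() for _ in range(10)]
print(counts)
assert sum(counts) > 0
assert sum(c > 0 for c in counts) >= 8
```

Of course a finite simulation cannot prove an almost-sure statement; it only illustrates why $S_n\to+\infty$ is implausible when $EX_i=0$.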