Question
Let $X_1, X_2, \dots \sim F$ be an i.i.d. sequence of observations such that $X_1 \geq 0$ and $E X_1 = \infty$. Set the sample mean $\overline X_n := \frac 1n \sum_{i=1}^n X_i$.
We know that $\overline X_n \overset{\text{a.s.}}{\longrightarrow} \infty$. But I'm looking for a result on how fast this occurs. (Not necessarily almost surely.)
For example, a sequence $w_n \to \infty$ such that $P(\overline X_n > w_n) \to 1$ for $n \to \infty$.
This could be some function of $F$. So maybe we'd say a sequence $\big(w_n(F)\big)_{n \in \mathbb N}$.
Thoughts
Note that $P(\overline X_n \leq w_n) = P(\sum_i X_i \leq nw_n) \leq P(\max_i X_i \leq nw_n) = P(X_1 \leq nw_n)^n = F(nw_n)^n$, where the inequality holds because the $X_i$ are nonnegative, so the sum dominates the maximum. Hence any sequence $w_n \to \infty$ that grows slowly enough that $F(nw_n)^n \to 0$ as $n \to \infty$ works.
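As a quick numerical sanity check (a simulation sketch; the Pareto-type distribution and the choice $w_n = \sqrt n$ are illustrative, not from the question): for the tail $1-F(x)=x^{-1/2}$, $x \ge 1$ (so $E X_1 = \infty$), the bound is $F(nw_n)^n = \big(1-(nw_n)^{-1/2}\big)^n \approx e^{-n^{1/4}} \to 0$ when $w_n = \sqrt n$, so this $(w_n)$ works, and the empirical exceedance probability is indeed close to $1$:

```python
import numpy as np

rng = np.random.default_rng(0)

def pareto_half(size):
    # Inverse-CDF sampling for 1 - F(x) = x**(-1/2), x >= 1
    # (tail index 1/2, so E[X] = infinity).
    u = rng.random(size)
    return (1.0 - u) ** (-2.0)

reps = 500
for n in [100, 1000, 5000]:
    w_n = np.sqrt(n)                          # candidate sequence w_n = sqrt(n)
    x = pareto_half((reps, n))
    emp = np.mean(x.mean(axis=1) > w_n)       # empirical P(mean > w_n)
    bound = (1.0 - (n * w_n) ** -0.5) ** n    # F(n w_n)^n bounds P(mean <= w_n)
    print(n, emp, bound)
```

The printed `bound` column shrinks toward $0$ while `emp` sits near $1$, consistent with the inequality above.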
The problem is that if $1-F(x) \approx 1/x$ for large $x$, then this bound cannot certify any such sequence, since $F(nw_n)^n \to 0$ would require $w_n \to 0$:
Suppose $w_n \to \infty$. Then $$ \begin{align} F(nw_n)^n \approx \Big[ 1 - \frac{1}{nw_n} \Big]^n &= \bigg( \Big[ 1 - \frac{1}{nw_n} \Big]^{nw_n} \bigg)^{n/(nw_n)}\\ &= \exp \left( \frac{n}{nw_n}\log \bigg( \Big[ 1 - \frac{1}{nw_n} \Big]^{nw_n} \bigg) \right) \\ &= \exp \left( \underbrace{\frac{1}{w_n}}_{\to 0} \underbrace{ \log \bigg( \Big[ 1 - \frac{1}{nw_n} \Big]^{nw_n} \bigg)}_{ \to -1} \right) \to \exp(0) = 1. \end{align} $$ I'd like to find a $(w_n)$ that would work for any distribution such that $E X_1 = \infty$.
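The degeneracy computed above is easy to see numerically (taking $F(x) = 1 - 1/x$ for $x \ge 1$ exactly, and the illustrative choice $w_n = \log n$): the bound $(1 - 1/(nw_n))^n$ tracks $\exp(-1/w_n)$ and creeps toward $1$ rather than $0$.

```python
import math

for n in [10**2, 10**4, 10**6, 10**8]:
    w_n = math.log(n)                      # any w_n -> infinity shows the same effect
    bound = (1.0 - 1.0 / (n * w_n)) ** n   # F(n w_n)^n for the tail 1 - F(x) = 1/x
    print(n, bound, math.exp(-1.0 / w_n))  # heuristic limit exp(-1/w_n)
```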
Answer
This is not a full answer, but one can look at the Marcinkiewicz-Zygmund law of large numbers to gain some insight (I don't have an online reference for this theorem, but see for instance Section 6.7 in Allan Gut's book Probability: A Graduate Course). Part of this law states that if $E[X_1^r]<\infty$ for some $0<r<1$, then $$\frac{1}{n^{1/r}}\sum_{i=1}^n X_i \to 0 \; \text{a.s.}$$ This result implies that no $w_n=n^a$ with $a>0$ can work for every such distribution: simply consider $X_1$ such that $E[X_1]=\infty$ but $E[X_1^r]<\infty$ for $r=1/(1+a)$, which lies in $(0,1)$ (e.g. a tail $1-F(x)\sim 1/(x\log x)$ has $E[X_1]=\infty$ yet $E[X_1^r]<\infty$ for every $r<1$). Then $\overline X_n/n^a = n^{-(1+a)}\sum_{i=1}^n X_i \to 0$ a.s., so $P(\overline X_n > n^a) \to 0$.
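This failure is visible in simulation (a sketch with the illustrative tail $1-F(x)=x^{-1/2}$, which has $E[X_1]=\infty$ but $E[X_1^r]<\infty$ for every $r<1/2$): taking $a=2$, so $r=1/(1+a)=1/3<1/2$, the law gives $\overline X_n/n^2 \to 0$ a.s., and the empirical probability $P(\overline X_n > n^2)$ stays small and shrinks.

```python
import numpy as np

rng = np.random.default_rng(1)

def pareto_half(size):
    # 1 - F(x) = x**(-1/2) for x >= 1:
    # E[X] = infinity, but E[X**r] < infinity for all r < 1/2.
    return (1.0 - rng.random(size)) ** -2.0

a = 2              # so r = 1/(1+a) = 1/3 < 1/2 and E[X**r] < infinity
reps = 2000
for n in [100, 400, 1600]:
    x = pareto_half((reps, n))
    emp = np.mean(x.mean(axis=1) > n**a)   # empirical P(mean > n^a)
    print(n, emp)
```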
Of course, one could still hope for slower rates of convergence, e.g. a logarithmic rate. But my sense is that a similar phenomenon would occur if one considers distributions such that $E\big[X_1/\log^a(1+X_1)\big]<\infty$ for some $a>0$.
Note that if we give up on the uniformity requirement, there is a converse to the Marcinkiewicz-Zygmund law of large numbers: if $E[X_1^r]=\infty$, then $$\limsup_{n\to\infty} \frac{1}{n^{1/r}}\sum_{i=1}^n X_i =\infty \; \text{a.s.}$$ So $w_n=n^a$, $a>0$, works in this $\limsup$ sense (i.e. $\overline X_n > w_n$ infinitely often a.s., which is weaker than $P(\overline X_n > w_n)\to 1$) whenever $E[X_1^{1/(1+a)}]=\infty$.
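A single-path illustration of this $\limsup$ statement (simulation cannot verify an a.s. claim, but it shows the flavor; the distribution is again an illustrative choice): with the tail $1-F(x)=x^{-2/3}$ one has $E[X_1^{2/3}]=\infty$, so for $a=1/2$ (i.e. $r=1/(1+a)=2/3$) the running maximum of $n^{-3/2}\sum_{i\le n}X_i$ should keep jumping upward at irregular times along a typical path.

```python
import numpy as np

rng = np.random.default_rng(2)

# 1 - F(x) = x**(-2/3) for x >= 1: E[X] = infinity and E[X**(2/3)] = infinity.
# With a = 1/2 (so 1/(1+a) = 2/3), the converse law gives
# limsup_n (1/n**(3/2)) * sum_{i<=n} X_i = infinity a.s.
N = 10**6
x = (1.0 - rng.random(N)) ** (-1.5)     # inverse-CDF sample from this Pareto tail
s = np.cumsum(x)                        # partial sums S_n
n = np.arange(1, N + 1)
ratio = s / n**1.5                      # S_n / n^{1/r} with r = 2/3
running_max = np.maximum.accumulate(ratio)
for k in [10**2, 10**3, 10**4, 10**5, 10**6]:
    print(k, running_max[k - 1])
```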