Does there exist a random variable $X$ with $\mathbb{E}X = \infty$ and constants $a_n \to \infty$ such that if $X_1, X_2, \ldots$ are iid $\sim X$, then $$\frac{X_1 + X_2 + \cdots + X_n}{a_n} \to Z \quad \text{as } n \to \infty$$ for some non-trivial random variable $Z$, i.e. one taking values in $(0,\infty)$? (Choose your favorite mode of convergence: distribution, probability, almost sure.)
I suspect this is impossible. Can we prove this? (If I've missed something simple, feel free to suggest an additional assumption that rules out a trivial example or a variant that makes this easier.)
Clearly $Z$ would have to have $\mathbb{E}Z = \infty$, and $a_n$ would have to satisfy $a_n \gg n$ (since $S_n/n \to \infty$ a.s. by the strong law when $\mathbb{E}X = \infty$).
One candidate is something like $P(X = n!) = 2^{-n}$ for $n \geq 1$, i.e. a distribution where the sum of the $X_i$ is essentially equal to the maximum of the $X_i$. If that maximum scaled 'smoothly' enough, a limit might exist.
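A quick simulation (my own sketch; the sampler and sample size are arbitrary choices, not from any reference) illustrates the "sum $\approx$ max" heuristic for this distribution:

```python
import math
import random

random.seed(0)

def sample_x():
    """Draw X with P(X = n!) = 2^{-n}: first draw n ~ Geometric(1/2) on {1, 2, ...}."""
    n = 1
    while random.random() < 0.5:
        n += 1
    return math.factorial(n)

# The sum of many iid copies is dominated by its largest terms,
# because n! grows much faster than the 2^{-n} tail decays.
xs = [sample_x() for _ in range(10_000)]
print(max(xs) / sum(xs))  # typically a sizable fraction of 1
```

The ratio fluctuates from run to run (it drops when several samples tie at the top level), which already hints at the roughness that the a.s. result below makes precise.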
A possibly interesting generalization: what if the deterministic $a_n$ are replaced by an iid sequence $Y_n$, independent of the $X_i$? Could
$$\lim_{n \to \infty} \frac{X_1 + X_2 + \cdots + X_n}{Y_n}$$
actually exist?
In Durrett's *Probability: Theory and Examples*, Example 2.2.7 (the "St. Petersburg lottery") takes $T \sim \mathrm{Geom}(1/2)$ on $\{1, 2, \dots\}$ and $X = 2^T$, so that $P(X = 2^j) = 2^{-j}$ and $E[X] = \sum_{j \geq 1} 2^j \cdot 2^{-j} = \infty$. If $X_1, X_2, \dots$ are iid copies of $X$ and $S_n = X_1 + \dots + X_n$, Durrett shows that $S_n / (n \log_2 n) \to 1$ in probability.
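For what it's worth, here is a quick numerical sanity check of that weak law (my own sketch; the seed and sample size are arbitrary, and since the convergence is slow and the tails heavy, the ratio only lands in the rough vicinity of 1):

```python
import math
import random

random.seed(1)

def s_n(n):
    """One sample of S_n = X_1 + ... + X_n, where X = 2^T and T ~ Geometric(1/2)."""
    total = 0
    for _ in range(n):
        t = 1
        while random.random() < 0.5:
            t += 1
        total += 2 ** t
    return total

n = 100_000
ratio = s_n(n) / (n * math.log2(n))
print(ratio)  # tends to 1 in probability as n grows
```

Repeating this with fresh seeds shows the ratio clustering near 1 but with occasional large excursions caused by a single huge $X_i$, which is exactly why the convergence holds only in probability.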
On the other hand, Durrett's Theorem 2.5.9 (due to Feller) states that if $E|X| = \infty$ and $a_n/n$ is increasing, then $\limsup_n |S_n|/a_n$ is a.s. either $0$ or $\infty$, according as $\sum_n P(|X_1| \geq a_n)$ converges or diverges. In particular $S_n/a_n$ cannot converge a.s. to a finite nonzero limit for any such $a_n$. Since $a_n = n \log_2 n$ has $a_n/n = \log_2 n$ increasing, the theorem applies here: you cannot strengthen "in probability" to "almost surely".