I'm reading about the Law of Large Numbers in the book Probability Theory: Independence, Interchangeability, Martingales, by Yuan Shih Chow and Henry Teicher:
Theorem 4 (Feller): If $\{X_n\}$ are i.i.d., then $(S_n-C_n)/n$ converges in probability to $0$ for some choice of $(C_n)$ iff $nP(|X_1|>n) \rightarrow 0$. And then $\lim C_n/n=E(X_1)$.
But what if $X_n$ has no expected value?
Assume that $nP(|X_1|>n)\rightarrow 0$. Does $C_n/n$ still converge to some limit, and if so, what is the limit?
My first thought was "this must be wrong", so I constructed a simple, positive-valued distribution $$ X=\frac{1}{U\cdot\ln(1+1/U)}\text{ where }U\sim\text{Uniform}[0,1] $$ which satisfies $n\cdot\Pr[X>n]\rightarrow 0$ as $n\rightarrow\infty$, but has $\text{E}[X]=\infty$.
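The tail condition can be checked numerically (a sketch of my own, including the function names): since $g(u)=u\ln(1+1/u)$ is strictly increasing on $(0,1]$ (because $\ln(1+x)>x/(1+x)$ for $x>0$), we have $X>n$ iff $g(U)<1/n$, so $\Pr[X>n]=g^{-1}(1/n)$, which bisection can evaluate exactly:

```python
import math

def g(u):
    # g(u) = u*ln(1+1/u), strictly increasing on (0, 1];
    # X = 1/g(U), so X > n  iff  g(U) < 1/n, giving P[X > n] = g^{-1}(1/n)
    return u * math.log1p(1.0 / u)

def tail(n):
    # invert g by bisection to compute P[X > n]
    lo, hi = 1e-300, 1.0
    for _ in range(200):
        mid = (lo + hi) / 2
        if g(mid) < 1.0 / n:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

vals = [n * tail(n) for n in (10, 10**2, 10**3, 10**4, 10**5, 10**6)]
print(vals)  # decreases toward 0, but only roughly like 1/ln n
```

Since $u_n=g^{-1}(1/n)$ satisfies $u_n\ln(1+1/u_n)=1/n$, we get $n\cdot\Pr[X>n]=1/\ln(1+1/u_n)$, which decays like $1/\ln n$: the condition holds, but only barely.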
The following are observations, not proofs, but should be sufficient to understand what is going on.
If you sample values of $S_n/n=(X_1+\cdots+X_n)/n$, they tend to cluster around a value $C_n/n$, but with some extreme values in the upper tail.
As $n$ increases, the centre of the cluster $C_n/n$ will increase slowly without converging, while the distribution will become narrower (in the sense that for any $a>0$ you have $\Pr[|S_n/n-C_n/n|>a]\rightarrow 0$ as $n$ increases). In other words, $S_n/n-C_n/n$ will converge in probability to zero, but $C_n/n\rightarrow\infty$ as $n\rightarrow\infty$.
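A small Monte Carlo sketch illustrates this (not a proof; the sampler, seed, and replicate count are my own choices). The median of $S_n/n$ across replicates tracks $C_n/n$ and keeps creeping upward (roughly like $\ln\ln n$ for this particular $X$), while the interquartile range shrinks, though only at roughly a $1/\sqrt{\ln n}$ rate:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_X(shape):
    # U ~ Uniform(0,1]; X = 1/(U*ln(1+1/U)) as in the construction above
    u = 1.0 - rng.random(shape)  # shift so u is never exactly 0
    return 1.0 / (u * np.log1p(1.0 / u))

reps = 500
results = []
for n in (10**2, 10**3, 10**4):
    means = sample_X((reps, n)).mean(axis=1)  # 500 realisations of S_n/n
    q1, med, q3 = np.percentile(means, [25, 50, 75])
    results.append((n, med, q3 - q1))
    print(f"n={n:>6}  median of S_n/n = {med:.3f}  IQR = {q3 - q1:.3f}")
```

The median column keeps growing across the three values of $n$, while the IQR column narrows only slightly, consistent with the very slow $1/\sqrt{\ln n}$ concentration.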
The criterion $n\cdot\Pr[X>n]\rightarrow 0$ is important because it ensures (in probability, though not almost surely) that the jumps in the sequence $S_n/n$ become smaller and smaller as $n$ increases. However, when the expected value is infinite (or undefined), the "jumps" caused by extreme values of the distribution are frequent enough that $S_n/n$ keeps drifting upward rather than converging.
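Both effects are visible along a single sample path (again an illustrative sketch with my own variable names): the running average $S_n/n$ drifts upward without settling, while the largest one-step jump per decade tends to shrink, though only on the order of $1/\ln n$. Note that this $X$ is bounded below by $1/\ln 2\approx 1.44$, so $S_n/n\ge 1.44$ always:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 10**6
u = 1.0 - rng.random(N)                       # U ~ Uniform(0,1]
x = 1.0 / (u * np.log1p(1.0 / u))             # one i.i.d. path of X
running = np.cumsum(x) / np.arange(1, N + 1)  # S_n/n for n = 1..N

for n in (10**2, 10**3, 10**4, 10**5, 10**6):
    # largest one-step move of S_k/k over the decade (n/10, n]
    jump = np.abs(np.diff(running[n // 10 - 1 : n])).max()
    print(f"n={n:>7}  S_n/n = {running[n - 1]:.3f}  max jump = {jump:.3f}")
```

The occasional huge summand (of order $n$ or more) is exactly what keeps lifting the running average, even though each individual jump in $S_n/n$ is getting smaller in probability.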