I am trying to understand the proof for the following theorem:
Given a sequence of independent, identically distributed random variables $X_n$ with common mean $\mu = \mathbb{E}[X_n]$ for all $n$ and $\mathbb{E}[|X_1|] < \infty$, the Weak Law of Large Numbers holds:
$$\dfrac{S_n}{n}\xrightarrow{\mathbb{P}}\mu \quad \text{as } n \to \infty,$$
where $S_n = X_1 + \dots + X_n$.
Now my question is, what is the intuition of $\mathbb{E}[|X_1|]<\infty$? If I understood correctly, this theorem is an alternate for the WLLN, without the assumption of existence of variance. If $\mathbb{E}[X_1]=\mu$, how does that not imply that $\mathbb{E}[|X_1|]<\infty$?
Can you provide an example such that $\mathbb{E}[X]=\mu$ and $\mathbb{E}[|X|]=\infty$ as a counterexample?
As remarked in another answer, the mean $\mathbf E[X]$ is only well defined (as a finite number) when $\mathbf E[|X|] < \infty$.
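The standard example of the failure mode you ask about is the Cauchy distribution: its density is symmetric about $0$, so one is tempted to say "the mean is $0$", but $\mathbf E[|X|] = \infty$, and so $\mathbf E[X]$ is simply undefined. A small numerical sketch (the closed form below comes from integrating $|x|/(\pi(1+x^2))$ over $[-C, C]$):

```python
import numpy as np

# Standard Cauchy density: f(x) = 1 / (pi * (1 + x^2)).
# Truncated absolute moment:
#   E[|X| * 1{|X| <= C}] = 2 * int_0^C x / (pi * (1 + x^2)) dx
#                        = log(1 + C^2) / pi,
# which grows without bound as C -> infinity, so E[|X|] = infinity.
def truncated_abs_moment(C):
    return np.log(1.0 + C**2) / np.pi

for C in (1e1, 1e3, 1e6):
    print(f"C = {C:>9.0e}: E[|X| 1{{|X|<=C}}] = {truncated_abs_moment(C):.2f}")

# Consequence: running means of Cauchy samples never settle down,
# in contrast to what the weak law would guarantee.
rng = np.random.default_rng(0)
x = rng.standard_cauchy(100_000)
running_mean = np.cumsum(x) / np.arange(1, x.size + 1)
print("running mean at n = 10^3, 10^4, 10^5:",
      running_mean[[999, 9_999, 99_999]])
```

Note that the truncated absolute moment grows like $\tfrac{2}{\pi}\log C$: it diverges, but slowly, which is why a symmetric density can look deceptively well behaved.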
The version of the weak law that you state above, requiring only $\mathbf E[|X|] < \infty$, is in fact the standard statement of the weak law of large numbers.
So why do we often see the weak law stated with the condition $\text{Var}(X) < \infty$? The answer is that this version of the weak law is significantly easier to prove.
Proof of the weak law under finite variance.
If $\text{Var}(X_1) = \sigma^2 < \infty$, and if we denote $\overline X_n = n^{-1} \sum_{i=1}^n X_i$, then we can use Chebyshev's Inequality to note
$$ \mathbf P \left[ | \overline X_n - \mu | > \epsilon \right] \leq \frac{\sigma^2}{n \epsilon^2},$$ which converges to $0$ as $n \to \infty$ for any fixed $\epsilon > 0$.
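The bound is easy to check by simulation. A sketch with Uniform$(0,1)$ variables (the sample size, number of trials, and $\epsilon$ are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials, eps = 1_000, 10_000, 0.01
mu, sigma2 = 0.5, 1.0 / 12.0  # Uniform(0,1): mean 1/2, variance 1/12

# Draw `trials` independent sample means, each of n Uniform(0,1) variables.
sample_means = rng.random((trials, n)).mean(axis=1)

# Empirical P(|Xbar_n - mu| > eps) versus the Chebyshev bound sigma^2/(n eps^2).
empirical = np.mean(np.abs(sample_means - mu) > eps)
chebyshev = sigma2 / (n * eps**2)
print(f"empirical: {empirical:.3f}   Chebyshev bound: {chebyshev:.3f}")
```

Chebyshev's inequality is far from tight here: the empirical probability is much smaller than the bound. But tightness is not needed; all that matters for the weak law is that the bound tends to $0$ as $n \to \infty$.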
In the absence of finite variance.
We do not prove this in detail, but rather sketch the idea.
The idea is to define truncated variables $Y_i = X_i$ if $|X_i| < C$ and $Y_i = 0$ otherwise. The truncated variables $Y_i$ are bounded by $C$, so they necessarily have finite variance, and hence satisfy a weak law of large numbers (by the argument above).
A technical proof is then required to relate the weak law for this truncated sequence to the full weak law. A proof is given here.
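The truncation step can be sketched numerically. In this illustration (my choices, not part of the proof) the $X_i$ are classical Pareto($\alpha = 1.5$) variables, which have finite mean $\mu = 3$ but infinite variance; the cutoff $C$ is arbitrary. The code only shows the two facts the proof exploits: the truncated variables have finite variance, and $\mathbf E[Y] \to \mathbf E[X]$ as $C \to \infty$. The rigorous step relating the truncated law to the full law is the technical part of the proof.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, n = 1.5, 100_000
mu = alpha / (alpha - 1.0)  # E[X] = 3 for Pareto(1.5) on [1, inf)

# numpy's pareto() draws the Lomax form; adding 1 gives the classical
# Pareto on [1, inf) with density alpha * x^(-alpha - 1).
x = 1.0 + rng.pareto(alpha, size=n)

for C in (10.0, 100.0, 10_000.0):
    y = np.where(np.abs(x) < C, x, 0.0)  # truncated variables Y_i
    # For this distribution, E[Y] = mu * (1 - C^(1 - alpha)) in closed form.
    expected = mu * (1.0 - C ** (1.0 - alpha))
    print(f"C = {C:>8.0f}: mean(Y) = {y.mean():.3f}   E[Y] = {expected:.3f}")
```

As $C$ grows, the truncated mean approaches $\mu = 3$, while for each fixed $C$ the variables $Y_i$ are bounded, so the finite-variance weak law applies to them.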