Discussions of the law of large numbers frequently begin like this:
Let $X$ be a real-valued random variable, and let $X_1$, $X_2$, $\ldots$ be an infinite sequence of i.i.d. copies of $X$. Let $\bar{X}_n := \frac{1}{n}(X_1 + \ldots + X_n)$ be the $n$-th empirical average of this sequence.
But this setup confuses me, because I cannot imagine a scenario where, given the constraints above, it is not the case that
$$ X = \frac{1}{n}(X_1 + \ldots + X_n) $$
But it must be possible for the left-hand side of this equation to differ from the right-hand side (otherwise the law of large numbers would be trivially true, and hence uninteresting).
So does there exist a simple example where
$$ X \ne \frac{1}{n}(X_1 + \ldots + X_n)? $$
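To make the setup concrete, here is how I would simulate it, taking $X$ to be a fair coin flip (my choice of distribution, i.e. $X \in \{0, 1\}$ with probability $1/2$ each):

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

# X is a fair coin flip: X takes values in {0, 1}, each with probability 1/2.
n = 10
draws = [random.randint(0, 1) for _ in range(n)]  # realizations of X_1, ..., X_n
x_bar = sum(draws) / n                            # the empirical average \bar{X}_n

print("draws:", draws)
print("empirical average:", x_bar)
```

Each $X_i$ here takes only the values $0$ or $1$, while the empirical average can land anywhere in $\{0, 0.1, \ldots, 1\}$, so I don't see how the two sides could be literally equal as random variables.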