A precondition for the law of large numbers is that $X_1$, $X_2$, $\ldots$ is a sequence of i.i.d. random variables, each distributed as some random variable $X$, so that $E[X_i] = E[X]$ for every $i$.
Now suppose $X$ is the traditional random variable associated with tossing a fair coin. In particular, $X(0) = 0$, $X(1) = 1$, $P(X = 0) = P(X = 1) = 0.5$.
We can then assume that, in our case,
$X_1 = 1$ is the random variable associated with the coin coming up heads, and $X_0 = 0$ is the random variable associated with the coin coming up tails.
Question: Isn't it the case that
$$E[X_0] = 0.5 \cdot 0 + 0.5 \cdot 0 = 0 \ne 1 = 0.5 \cdot 1 + 0.5 \cdot 1 = E[X_1],$$
so that this setup already cannot satisfy the hypothesis of the law of large numbers (it violates the assumption that all of the random variables in the sequence $X_i$ have the same expected value)? If so, this is confusing, since a sequence of coin flips is frequently employed as a motivating example for the law of large numbers.
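For context, here is the simulation I understand to be behind that motivating example: a minimal sketch (names and seed are my own choices) in which each flip $X_i$ is an independent draw recorded as $1$ for heads and $0$ for tails, and the sample mean is compared against $E[X] = 0.5$.

```python
import random

def sample_mean(n_flips: int, seed: int = 0) -> float:
    """Average of n_flips independent fair-coin flips (1 = heads, 0 = tails)."""
    rng = random.Random(seed)
    return sum(rng.randint(0, 1) for _ in range(n_flips)) / n_flips

if __name__ == "__main__":
    # As n grows, the sample mean should settle near E[X] = 0.5.
    for n in (10, 1_000, 100_000):
        print(n, sample_mean(n))
```

Note that every flip here is a draw of the *same* variable; nothing in the simulation corresponds to two variables with expectations $0$ and $1$.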