The theorem states:
Let $(P, \pi)$ be a stationary Markov chain. If
$$ E[y_{t+1} | x_t] = y_t $$
then the random variable $y_t = \bar{y}' x_t $ is invariant.
Note: we let $\bar{y}$ be an $n \times 1$ vector of real numbers such that $y_t = \bar{y}_i$ if $x_t = e_i$. Also, $x_t$ (also an $n \times 1$ vector) is the underlying Markov process with transition matrix $P$ and stationary distribution $\pi$.
Pf: In a finite Markov chain, if $E[(y_{t+1} - y_t)^2] = 0$ then $y_{t+1} = y_t$. By the law of iterated expectations,
$$ E[(y_{t+1} - y_t)^2] = E\big[E[(y_{t+1}^2 - 2y_{t+1}y_t + y_t^2)|x_t]\big] $$
$$ = E\big[E[y_{t+1}^2|x_t] - 2E[y_{t+1}y_t|x_t] + E[y_t^2|x_t]\big] $$
$$ = E[y_{t+1}^2] - 2E[y_t^2] + E[y_t^2] = 0 $$
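To convince myself numerically that $E[y_{t+1}^2] = E[y_t^2]$ under the stationary distribution, I wrote this quick check (the transition matrix and the values in $\bar{y}$ are toy numbers of my own, not from the book):

```python
import numpy as np

# Toy 3-state transition matrix (hypothetical, just for illustration)
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

# Stationary distribution pi solves pi' P = pi', i.e. pi is the
# eigenvector of P' associated with eigenvalue 1, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1))])
pi = pi / pi.sum()

y_bar = np.array([1.0, -2.0, 3.0])  # arbitrary value of y on each state

# If x_t is drawn from pi, then x_{t+1} has distribution pi' P = pi',
# so both second moments are computed against the same distribution.
E_yt_sq  = pi @ (y_bar ** 2)          # E[y_t^2]
E_yt1_sq = (pi @ P) @ (y_bar ** 2)    # E[y_{t+1}^2]
print(np.isclose(E_yt_sq, E_yt1_sq))  # True under stationarity
```

So numerically the two second moments agree, which suggests the equality comes from stationarity rather than from the conditional-expectation assumption; I would like to confirm this reading.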
Why in the proof can they take
$$ E[y_{t+1}^2] = E[y_t^2], $$
given that the assumption $E[y_{t+1}|x_t] = y_t$ is about $y_t$, not $y_t^2$?
If I may also ask: why does a finite Markov chain have the property that $E[(y_{t+1} - y_t)^2] = 0$ implies $y_{t+1} = y_t$?
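For context, here is the expansion I have in mind for the second question (my own attempt, so I may be missing something). Writing the expectation out over the finite state space,

$$ E[(y_{t+1} - y_t)^2] = \sum_{i=1}^{n} \sum_{j=1}^{n} \pi_i P_{ij} \, (\bar{y}_j - \bar{y}_i)^2, $$

which is a finite sum of nonnegative terms. My guess is that if the sum is zero, every term with $\pi_i P_{ij} > 0$ must have $\bar{y}_j = \bar{y}_i$, but I would appreciate confirmation that this is the intended argument.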