Let $S$ be the state space (finite) of a Markov chain. We write $p(x, y)$ to denote the probability of transitioning from $x\in S$ to $y\in S$ in one step. We then naturally get a Markov chain on $S^k$, where the probability of transitioning from $(x_0, x_1,\ldots, x_{k-1})$ to $(x_1,x_2,\ldots, x_k)$ is $p(x_{k-1}, x_k)$.
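For concreteness, here is a small sketch of the construction (the helper name `induced_chain` is my own, and I am assuming states are labeled $0,\ldots,n-1$): given the one-step matrix $P$, it builds the transition matrix of the induced chain on $S^k$.

```python
from itertools import product

import numpy as np

def induced_chain(P, k):
    """Transition matrix of the chain on S^k induced by the one-step
    matrix P: state (x_0, ..., x_{k-1}) moves to (x_1, ..., x_k)
    with probability p(x_{k-1}, x_k)."""
    n = P.shape[0]
    states = list(product(range(n), repeat=k))
    index = {s: i for i, s in enumerate(states)}
    Q = np.zeros((n ** k, n ** k))
    for s in states:
        for y in range(n):
            t = s[1:] + (y,)          # shift the window, append y
            Q[index[s], index[t]] = P[s[-1], y]
    return Q, states

# Example: the two-state flip chain, with k = 2.
P = np.array([[0.0, 1.0], [1.0, 0.0]])
Q, states = induced_chain(P, 2)
print(states)         # [(0, 0), (0, 1), (1, 0), (1, 1)]
print(Q.sum(axis=1))  # each row is a probability distribution
```

Note that each row of $Q$ has at most $n$ nonzero entries, since only the last coordinate is free to move.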
Are there general results known about such chains? This seems like a very natural construction, but I have not been able to locate any source where it is discussed.
As an example, consider $S=\{a, b\}$ with $p(a, b)=p(b, a)=1$. Even though this chain is irreducible, the induced chain on $S^2$ is not: after one step the induced chain is confined to $\{(a,b), (b,a)\}$, so the states $(a,a)$ and $(b,b)$ can never be reached.
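As a quick sanity check on this example (a sketch, with states ordered $(a,a), (a,b), (b,a), (b,b)$), one can write out the induced $4\times 4$ matrix and test reachability via powers:

```python
import numpy as np

# Induced chain on S^2 for S = {a, b} with p(a,b) = p(b,a) = 1.
# State order: (a,a), (a,b), (b,a), (b,b).
Q = np.array([
    [0, 1, 0, 0],   # (a,a) -> (a,b)
    [0, 0, 1, 0],   # (a,b) -> (b,a)
    [0, 1, 0, 0],   # (b,a) -> (a,b)
    [0, 0, 1, 0],   # (b,b) -> (b,a)
], dtype=float)

# Reachability: (I + Q + Q^2 + Q^3)[i, j] > 0 iff state j is reachable
# from state i in at most 3 steps (enough for 4 states).
R = sum(np.linalg.matrix_power(Q, m) for m in range(4)) > 0
print(R[1, 0])  # is (a,a) reachable from (a,b)?
```

Since $R[1,0]$ is `False`, the state $(a,a)$ is not reachable from $(a,b)$, so the induced chain is indeed not irreducible.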