Prove that if the number of states in a Markov Chain is $M$, and if state $j$ can be reached from state $i$, then it can be reached in $M$ steps or less.
To me it seems to follow directly from the definition of the multistep transition probability, $$p^m(i,j)=P(X_{n+m}=j\mid X_n=i).$$
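As a concrete sanity check (purely illustrative, not part of the question), the $m$-step probabilities $p^m(i,j)$ are the $(i,j)$ entries of the $m$-th power of the one-step transition matrix. A minimal pure-Python sketch with a toy 2-state chain:

```python
def mat_mul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[r][k] * B[k][c] for k in range(n)) for c in range(n)]
            for r in range(n)]

def m_step(P, m):
    """Return P^m by repeated multiplication (identity matrix for m = 0)."""
    n = len(P)
    R = [[1.0 if r == c else 0.0 for c in range(n)] for r in range(n)]
    for _ in range(m):
        R = mat_mul(R, P)
    return R

# A toy 2-state chain (transition matrix chosen arbitrarily for illustration).
P = [[0.9, 0.1],
     [0.5, 0.5]]
P2 = m_step(P, 2)
# p^2(0, 1) = 0.9*0.1 + 0.1*0.5 = 0.14
print(P2[0][1])
```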
But what I tried to do is:
Let $i_0,i_1,\dots,i_m$ be states of the Markov chain with $i_0=i$ and $i_m=j$. If there exists a path from $i$ to $j$, then $$P_{i_k,i_{k+1}}>0\quad\text{for }k=0,\dots,m-1,$$ and so any state reachable from another can be reached in $n\leq m$ steps, for some $n\in\mathbb{Z}^+$.
Here is an informal proof; it isn't that hard to see how to formalize it, but the notation for doing so is a bit tedious.
Suppose you have a possible path $(i_0,i_1,\dots,i_n)$, where $i_0=i$, $i_n=j$, and $i_{k_1}=i_{k_2}=r$ for some $k_1<k_2$. Since the step from $i_{k_2}=r$ to $i_{k_2+1}$ has positive probability, it's possible to go directly from $r$ to $i_{k_2+1}$, and from there to get to $j$ along the tail of the path. That means $(i_0,\dots,i_{k_1-1},r,i_{k_2+1},\dots,i_n)$ is a possible path which is strictly shorter than the one you started with. Repeat this procedure until no state appears twice in your path. The resulting path visits at most $M$ states (including the initial state), and therefore takes at most $M-1\le M$ steps.