Let $\{X_n \mid n\ge 0\}$ be a time-homogeneous Markov chain with state space $S$, and let $p_{ij}(n) = P(X_n = j \mid X_0 = i)$.
For $1\le m<n$, conditioning on the state at time $m$ gives
$$p_{ij}(n) = \sum_{k\in S} P(X_n=j\mid X_m = k, X_0 = i)\,P(X_m=k\mid X_0=i) = \sum_{k\in S} p_{kj}(n-m)\,p_{ik}(m).$$
My question may be trivial: why does $P(X_n=j\mid X_m = k, X_0 = i)= p_{kj}(n-m)$ hold?
$$P(X_n = j \mid X_m = k, X_0 = i) = P(X_n = j \mid X_m = k) = P(X_{n-m} = j \mid X_0 = k) =: p_{kj}(n-m).$$
The first equality is the Markov property: given the present state $X_m = k$, the future is conditionally independent of the past $X_0 = i$. The second equality is time-homogeneity: the transition probabilities depend only on the elapsed time $n-m$, not on the starting time $m$.
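In matrix form, the identity above says $P^n = P^m P^{\,n-m}$, where $P$ is the one-step transition matrix. A minimal numerical sketch, using a hypothetical 3-state transition matrix chosen just for illustration:

```python
import numpy as np

# Hypothetical 3-state one-step transition matrix (each row sums to 1).
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
])

n, m = 5, 2  # split the n-step transition at intermediate time m

# n-step transition probabilities p_ij(n) are the entries of P^n.
lhs = np.linalg.matrix_power(P, n)

# Chapman–Kolmogorov: p_ij(n) = sum_k p_ik(m) p_kj(n - m),
# i.e. P^n = P^m @ P^(n-m).
rhs = np.linalg.matrix_power(P, m) @ np.linalg.matrix_power(P, n - m)

assert np.allclose(lhs, rhs)
```

The assertion holds for any stochastic matrix $P$ and any split $1\le m<n$, since it is just associativity of matrix multiplication.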