In class, I've been introduced to the following definition of a homogeneous Markov Chain:
**Definition.** A Markov chain $X = (X_t, \, t \in T)$ is said to be homogeneous if and only if the transition probabilities are stationary, i.e., if $p_{i,j}^{(m,n)}$ depends only on the length $|m-n|$. In this case, we can write
$$ p_{i,j}^{(n)} = P(X_{m+n} = j \, | \, X_{m} = i).$$
My doubt. I've seen (with a quick browse on Google) that sometimes people restrict this definition to the case $|m-n| = 1$. Essentially, in what I've seen, people say that a Markov Chain is homogeneous if and only if $p_{i,j}^{(m,m+1)}$ doesn't depend on $m$. I was wondering if these two definitions are equivalent and if so, why.
**My attempt.** Let's first assume that $X$ is a homogeneous Markov chain in the bold-definition sense. In this case, we can simply pick $n = m+1$, and then $p_{i,j}^{(m,m+1)}$ doesn't depend on $m$, showing that the bold definition implies the one I stated in "My doubt".
On the other hand, let's suppose that $X$ is a homogeneous Markov Chain in the sense of the "My doubt" text and fix some elements $n,h \in T$. Then
$$ p_{i,j}^{(n,n+h)} = \sum_{k_1, \dots, k_{h-1} \in E} p_{i,k_1}^{(n,n+1)} p_{k_1,k_2}^{(n+1,n+2)} \dots p_{k_{h-1},j}^{(n+h-1,n+h)} = \sum_{k_1, \dots, k_{h-1} \in E} p_{i,k_1} p_{k_1,k_2} \dots p_{k_{h-1},j},$$
which depends only on the number of steps $h$, not on $n$. Therefore, the two definitions are equivalent.
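As a quick sanity check of this step, here is a small sketch (with a hypothetical two-state transition matrix `p`, numbers chosen arbitrarily) that computes the $h$-step probability by summing over all intermediate states $k_1, \dots, k_{h-1}$, exactly as in the Chapman–Kolmogorov sum above; since the one-step probabilities don't depend on the time $n$, the result depends only on $h$:

```python
import itertools

# Hypothetical one-step transition matrix p[i][j] on E = {0, 1};
# rows sum to 1.
p = [[0.9, 0.1],
     [0.4, 0.6]]
E = range(len(p))

def p_h(i, j, h):
    """h-step transition probability via the Chapman-Kolmogorov sum:
    sum over all intermediate states k_1, ..., k_{h-1} in E."""
    total = 0.0
    for path in itertools.product(E, repeat=h - 1):
        states = (i, *path, j)
        prob = 1.0
        for a, b in zip(states, states[1:]):
            prob *= p[a][b]
        total += prob
    return total

# The rows of the h-step transition probabilities still sum to 1:
print(p_h(0, 0, 3) + p_h(0, 1, 3))  # → 1.0 (up to rounding)
```

Nothing in the computation refers to the starting time $n$, which mirrors the point of the displayed equation.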
Is my logic correct?
Thanks for any help in advance.
For a discrete-time Markov chain, the two definitions are equivalent, and the proof in the question is exactly why they are equivalent. If the Markov chain also has a finite (or at least countable) state space, then we can also prove this with matrices. Let $P_n$ be the matrix whose $(i,j)^{\text{th}}$ entry is $p^{(n,n+1)}_{i,j}$. Then the product $P_n P_{n+1} \dots P_{m-1}$ is exactly the matrix whose $(i,j)^{\text{th}}$ entry is $p_{i,j}^{(n,m)}$. If every matrix $P_n$ is equal to the same matrix $P$, then every product of this type is just equal to $P^{m-n}$, so we conclude that $p_{i,j}^{(n,m)}$ only depends on $m-n$.
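The matrix argument can be checked numerically. A minimal sketch (with a hypothetical three-state matrix `P`; the state space and entries are made up for illustration) showing that when every per-step matrix equals $P$, the product $P_n P_{n+1} \dots P_{m-1}$ collapses to the power $P^{m-n}$:

```python
import numpy as np

# Hypothetical one-step transition matrix P on a 3-state space;
# rows sum to 1.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.8, 0.1],
              [0.3, 0.3, 0.4]])

m, n = 7, 2  # transition from time n to time m, i.e. m - n = 5 steps

# If every matrix P_n equals the same P, the product
# P_n P_{n+1} ... P_{m-1} is just the (m - n)-th power of P:
product = np.linalg.multi_dot([P] * (m - n))
power = np.linalg.matrix_power(P, m - n)
assert np.allclose(product, power)
```

Any other time window of the same length $m-n$ yields the same matrix, so $p_{i,j}^{(n,m)}$ depends only on $m-n$.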
We may prefer the single-step definition because that's how homogeneous Markov chains are usually specified, or the broader definition because it's easier to apply.
I assume that the question only had discrete-time Markov chains in mind. But for the sake of completeness...
...for a continuous-time Markov chain, we need the broad definition; it is not enough to check what happens between time $n$ and time $n+1$, because transitions can also happen at time $n+\frac12$ or any other time in the interval $[n,n+1)$. In fact, we can think of a discrete-time Markov chain as an inhomogeneous continuous-time Markov chain where the transition probability $p^{(m,n)}_{i,j}$ depends on the number of integers in the interval $[m,n)$.