Definition: $p^m (i,j)=P(X_{n+m} =j\mid X_n = i)$.
Claim: Suppose there exists an $m$ such that $p^m (i, j)>0$, and an $n$ such that $p^n (j,k)>0$. Then, $p^{m+n} (i, k)\geq p^m(i, j) p^n(j,k)$.
I am trying to understand and prove this claim. My approach was to expand the left side using Bayes' theorem, but the terms do not cancel cleanly. Is there a hint on how to attack this problem? Thanks in advance!
Update: Do we have "$\geq$" in the claim simply because there may be other ways to get from $i$ to $k$, such as $i\rightarrow v\rightarrow k$ which also takes $m+n$ steps in total?
The left side is the probability of going from $i$ to $k$ in $m+n$ steps. The right side is the probability of going from $i$ to $j$ in $m$ steps and then from $j$ to $k$ in $n$ steps. Since there are presumably other $(m+n)$-step paths from $i$ to $k$ besides those that visit $j$ after exactly $m$ steps, an inequality is appropriate.
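This path argument can be written out in one line. Conditioning on the state $v$ visited after $m$ steps (the Chapman–Kolmogorov equation) and then keeping only the $v=j$ term of the nonnegative sum gives:

```latex
p^{m+n}(i,k) \;=\; \sum_{v} p^{m}(i,v)\, p^{n}(v,k) \;\geq\; p^{m}(i,j)\, p^{n}(j,k),
```

where the inequality holds because every summand is nonnegative, so the sum is at least its $v=j$ term.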
You can achieve equality by summing the right side over all $j$, which may be instructive.
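As a sanity check, a short numerical sketch (assuming a randomly generated stochastic matrix; the state space size and the values of $m$, $n$ are arbitrary choices) verifies both the inequality for every triple $(i, j, k)$ and the equality obtained by summing over $j$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random 4-state transition matrix (rows normalized to sum to 1).
P = rng.random((4, 4))
P /= P.sum(axis=1, keepdims=True)

m, n = 2, 3
Pm = np.linalg.matrix_power(P, m)        # m-step probabilities p^m(i, j)
Pn = np.linalg.matrix_power(P, n)        # n-step probabilities p^n(j, k)
Pmn = np.linalg.matrix_power(P, m + n)   # (m+n)-step probabilities

# Inequality: p^{m+n}(i, k) >= p^m(i, j) * p^n(j, k) for all i, j, k.
lhs = Pmn[:, None, :]                    # shape (i, 1, k), broadcast over j
rhs = Pm[:, :, None] * Pn[None, :, :]    # shape (i, j, k)
assert np.all(lhs >= rhs - 1e-12)        # tolerance for floating point

# Equality after summing over j: p^{m+n}(i, k) = sum_j p^m(i, j) p^n(j, k),
# i.e. the Chapman-Kolmogorov equation, which is just matrix multiplication.
assert np.allclose(Pmn, Pm @ Pn)
```

The equality check makes the "sum over all $j$" remark concrete: the Chapman–Kolmogorov equation is exactly the statement that $m+n$-step transition matrices multiply, $P^{m+n} = P^m P^n$.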