Suppose we have a Markov chain $X = \{X_t\}_{t=0}^{\infty}$ with a countable state space $S$ and transition matrix $P = (p(i,j))_{i,j \in S}$. For each $t \in \{0, 1, 2, \dots\}$, the random variable $X_t$ takes values in $S$. The matrix $P$ is stochastic, that is, $p(i,j) \ge 0$ for all $i, j \in S$ and
$\sum\limits_{j\in S} p(i,j)=1$ for all $i \in S$.
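As a concrete illustration, here is a minimal sketch that checks the stochastic-matrix conditions and samples one trajectory of a chain. The 3-state matrix `P` and the `step` helper are hypothetical, chosen only to make the definitions executable; they are not part of the text.

```python
import numpy as np

# Hypothetical transition matrix on S = {0, 1, 2}; entries are illustrative.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
])

# P is stochastic: nonnegative entries, each row sums to 1.
assert (P >= 0).all()
assert np.allclose(P.sum(axis=1), 1.0)

def step(i, rng):
    """Sample X_{t+1} given X_t = i, using row i of P as the distribution."""
    return rng.choice(len(P), p=P[i])

rng = np.random.default_rng(0)
path = [0]                      # start the chain at state 0
for _ in range(5):
    path.append(step(path[-1], rng))
print(path)  # one sample trajectory X_0, ..., X_5
```

Note that the next state is drawn from row $X_t$ of $P$ alone, which is exactly the Markov property: the past states enter only through the current one.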
(Markov property) For any $t \in \{0, 1, 2, \dots\}$ and any states $i_0, \dots, i_{t-1}, i, j \in S$,
$P(X_{t+1} = j \mid X_t = i, X_{t-1} = i_{t-1}, \dots, X_0 = i_0) = p(i, j).$
For this reason, $p(i, j)$ is called a transition probability.
Using the Markov property, show the following strengthened version of the Markov property:
Let $t \ge 0$, $x, y \in S$, and $A \subseteq S^t$. Then
$P(X_{t+1} = y \mid X_t = x,\ (X_0, \dots, X_{t-1}) \in A) = P(X_{t+1} = y \mid X_t = x) = P_x(X_1 = y).$
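The intended argument can be sketched as follows (assuming the conditioning event $\{X_t = x,\ (X_0,\dots,X_{t-1}) \in A\}$ has positive probability): expand the conditional probability as a ratio, decompose the event $(X_0,\dots,X_{t-1}) \in A$ over the individual paths in $A$, and apply the Markov property to each term.

```latex
\begin{align*}
P(X_{t+1}=y \mid X_t=x,\ (X_0,\dots,X_{t-1})\in A)
&= \frac{\sum_{(i_0,\dots,i_{t-1})\in A} P(X_{t+1}=y,\ X_t=x,\ X_{t-1}=i_{t-1},\dots,X_0=i_0)}
        {\sum_{(i_0,\dots,i_{t-1})\in A} P(X_t=x,\ X_{t-1}=i_{t-1},\dots,X_0=i_0)} \\
&= \frac{\sum_{(i_0,\dots,i_{t-1})\in A} p(x,y)\, P(X_t=x,\ X_{t-1}=i_{t-1},\dots,X_0=i_0)}
        {\sum_{(i_0,\dots,i_{t-1})\in A} P(X_t=x,\ X_{t-1}=i_{t-1},\dots,X_0=i_0)}
= p(x,y).
\end{align*}
```

Since $p(x,y) = P(X_{t+1} = y \mid X_t = x) = P_x(X_1 = y)$, all three expressions in the claim agree.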