Let $X_0, X_1, X_2,\dots$ be a Markov chain on the state space $S=\{1, 2,\dots, n\}$,
and let $P$ be the transition matrix of this Markov chain.
Prove that
$$\Bbb{P}(X_{t+2}=j\mid X_t=i) = (P^2)_{ij}$$
for all $1\le i, j\le n$ and all $t=0,1,2,3,\dots$
I know that $P^2=\begin{pmatrix} p^{(2)}_{11} & p^{(2)}_{12} & \cdots& p^{(2)}_{1n} \\ p^{(2)}_{21} & p^{(2)}_{22} & \cdots& p^{(2)}_{2n} \\ \vdots & \vdots & & \vdots \\ p^{(2)}_{n1} & p^{(2)}_{n2} & \cdots& p^{(2)}_{nn} \\ \end{pmatrix}$,
but I am unsure how to complete the proof.
The definition of a Markov chain is as follows:

A stochastic process $X_0,X_1,\dots$ on a state space $S$ is a Markov chain if, for all $t\in\Bbb{N}$, it satisfies
$$\Bbb{P}(X_{t}=s\mid X_{t-1}=s_{t-1},X_{t-2}=s_{t-2},\dots,X_0=s_0) = \Bbb{P}(X_{t}=s\mid X_{t-1}=s_{t-1})$$
for all $s,s_{t-1},s_{t-2},\dots,s_0\in S$ for which the conditional probabilities are defined.
Conditioning on the value of $X_{t+1}$ (the law of total probability), we have, for any $i,j$, \begin{eqnarray*} P(X_{t+2}=j\mid X_t=i) &=& \sum_{k=1}^{n} P(X_{t+2}=j\mid X_{t+1}=k,X_t=i)\, P(X_{t+1}=k\mid X_t=i) \\ &=& \sum_{k=1}^{n} P(X_{t+2}=j\mid X_{t+1}=k)\, P(X_{t+1}=k\mid X_t=i) \qquad\text{by the Markov property} \\ &=& \sum_{k=1}^{n} p_{ik}\,p_{kj} \qquad\text{since the chain is time-homogeneous, so }P(X_{s+1}=j\mid X_s=k)=p_{kj}\text{ for every }s \\ &=& (P^2)_{ij}. \end{eqnarray*}
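As a numerical sanity check (not a substitute for the proof), you can verify that the sum $\sum_k p_{ik}p_{kj}$ from the last step really agrees entry-by-entry with the matrix square. The $3\times 3$ transition matrix below is a made-up example:

```python
import numpy as np

# A made-up 3-state transition matrix (each row sums to 1).
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
])
n = P.shape[0]

# Two-step probabilities computed directly from the derivation:
# P(X_{t+2}=j | X_t=i) = sum_k p_{ik} * p_{kj}
two_step = np.array([
    [sum(P[i, k] * P[k, j] for k in range(n)) for j in range(n)]
    for i in range(n)
])

# ...which coincides with the matrix square (P^2)_{ij}.
assert np.allclose(two_step, P @ P)

# Each row of P^2 is again a probability distribution over S.
assert np.allclose(two_step.sum(axis=1), 1.0)
```

The same check works for any row-stochastic matrix, and iterating it gives the general $m$-step statement $\Bbb{P}(X_{t+m}=j\mid X_t=i)=(P^m)_{ij}$.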