Consider a gambling game in which, on any turn, you win \$1 with probability $p=0.4$ and you lose \$1 with probability $1-p=0.6$. We have $p(i,j)=p\{X_{n+1}=j\mid X_n=i\}$ and thus, for $N=5$,
$$\big(p(i,j)\big)_{0\leq i,j\leq 5}=\begin{pmatrix}1.0&0&0&0&0&0\\0.6&0&0.4&0&0&0\\0&0.6&0&0.4&0&0\\0&0&0.6&0&0.4&0\\0&0&0&0.6&0&0.4\\0&0&0&0&0&1.0\end{pmatrix}.$$
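As a quick sanity check, this matrix can be built programmatically; here is a minimal sketch (my own code, assuming NumPy) for general $N$ and $p$:

```python
import numpy as np

def gamblers_ruin(N=5, p=0.4):
    """Transition matrix for the gambler's ruin chain on {0, ..., N}.

    States 0 and N are absorbing: once the gambler is broke (0) or has
    reached the target fortune (N), the game is over and the chain stays
    put, which is why p(0,0) = p(N,N) = 1.
    """
    P = np.zeros((N + 1, N + 1))
    P[0, 0] = P[N, N] = 1.0          # absorbing boundary states
    for i in range(1, N):
        P[i, i + 1] = p              # win $1: move from i to i+1
        P[i, i - 1] = 1 - p          # lose $1: move from i to i-1
    return P

P = gamblers_ruin()
print(P)                  # reproduces the matrix above
print(P.sum(axis=1))      # every row sums to 1 (row-stochastic)
```

Each row $i$ lists the probabilities of going from state $i$ to every state $j$, so each row must sum to 1.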
I don't understand why $p(0,0)=1=p(N,N)$. Moreover, I don't see how to read this matrix. Is, for example, $p(3,4)=p\{X_4=4\mid X_3=3\}$?
And are $p(0,0)=p\{X_0=0\mid X_0=0\}$ and $p(5,5)=p\{X_5=5\mid X_5=5\}$? My interpretation looks strange...
EDIT
Since I haven't had an answer from Tlön Uqbar Orbis Tertius, let's look at this example: let $X_n$ be the weather on day $n$. The weather is not exactly a Markov chain, but we'll suppose that it is. Let $1=\text{Rainy}$ and $2=\text{Sunny}$. How would you interpret $$\begin{array}{ccc}&\boldsymbol 1&\boldsymbol 2\\ \boldsymbol 1&.6&.4\\\boldsymbol 2&.2&.8\end{array} \ \ \ \ ?$$ Does $p(1,2)=p\{X_2=2\mid X_1=1\}$? If yes, how do we compute $p(n,n+1)$ for $n>2$? Because here we just have the results for $n\in\{1,2\}$.
Let's look at your weather example:
Let $X_n$ be the weather on day $n$. The weather is not exactly a Markov chain, but we'll suppose that it is. Let $1=\text{Rainy}$ and $2=\text{Sunny}$. How would you interpret $$\begin{array}{ccc}&\boldsymbol 1&\boldsymbol 2\\ \boldsymbol 1&.6&.4\\\boldsymbol 2&.2&.8\end{array} \ \ \ \ ?$$ Does $p(1,2)=p\{X_2=2\mid X_1=1\}$? If yes, how do we compute $p(n,n+1)$ for $n>2$? Because here we just have the results for $n\in\{1,2\}$.
My Explanation:
This means that if you are in state 1 ($Rainy$) at time $n$, there is a 0.6 probability that you will be in state 1 ($Rainy$) at time $n+1$ and a 0.4 probability that you will be in state 2 ($Sunny$) at time $n+1$.
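To make the row-wise reading concrete, here is a small simulation (my own sketch, assuming NumPy; in the code states are indexed 0 and 1 rather than 1 and 2):

```python
import numpy as np

P = np.array([[0.6, 0.4],    # row Rainy:  P(-> Rainy), P(-> Sunny)
              [0.2, 0.8]])   # row Sunny:  P(-> Rainy), P(-> Sunny)

rng = np.random.default_rng(0)

def step(state):
    """Sample tomorrow's weather given today's (0 = Rainy, 1 = Sunny)."""
    return rng.choice(2, p=P[state])

# Long-run fraction of rainy days, starting from Rainy.
# The stationary distribution solves pi = pi P, giving pi(Rainy) = 1/3,
# so the fraction should be close to 1/3 for a long run.
state, rainy, n = 0, 0, 100_000
for _ in range(n):
    state = step(state)
    rainy += (state == 0)
print(rainy / n)
```

Each row of $P$ is the conditional distribution of tomorrow's state given today's, which is exactly why the sampler passes `P[state]` as the probability vector.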
A discrete-time Markov chain describes the evolution of a system (defined by a state space $S = \{s_1, s_2, \ldots, s_N\}$) over discrete time periods $0, 1, 2, \ldots, T$.
$X_n$ is the state that your system is in (in your example it is either 1 ($Rainy$) or 2 ($Sunny$)) at time $n$.
Helpful Information Related to Your Question:
$p\{X_2=1\mid X_1=1\}=p\{X_3=1\mid X_2=1\}=p\{X_4=1\mid X_3=1\}=P(\text{weather goes from Rainy to Rainy in 1 step})=0.6$

$p\{X_2=2\mid X_1=1\}=p\{X_3=2\mid X_2=1\}=p\{X_4=2\mid X_3=1\}=P(\text{weather goes from Rainy to Sunny in 1 step})=0.4$

$p\{X_2=1\mid X_1=2\}=p\{X_3=1\mid X_2=2\}=p\{X_4=1\mid X_3=2\}=P(\text{weather goes from Sunny to Rainy in 1 step})=0.2$

$p\{X_2=2\mid X_1=2\}=p\{X_3=2\mid X_2=2\}=p\{X_4=2\mid X_3=2\}=P(\text{weather goes from Sunny to Sunny in 1 step})=0.8$
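In other words, the chain is time-homogeneous: the one-step probability $p\{X_{n+1}=j\mid X_n=i\}$ is the same entry $P[i,j]$ for every $n$, which answers the "$n>2$" question. For probabilities over several steps, the Chapman–Kolmogorov equations say you take matrix powers. A short sketch (my own code, assuming NumPy):

```python
import numpy as np

P = np.array([[0.6, 0.4],   # Rainy  -> (Rainy, Sunny)
              [0.2, 0.8]])  # Sunny  -> (Rainy, Sunny)

# One-step probabilities do not depend on n (time homogeneity):
# p{X_{n+1}=j | X_n=i} = P[i, j] for every n.

# m-step probabilities are entries of the m-th matrix power:
P2 = np.linalg.matrix_power(P, 2)
print(P2[0, 1])   # p{X_{n+2}=Sunny | X_n=Rainy} = 0.6*0.4 + 0.4*0.8 = 0.56
```

So the table in the question gives you every $p(n,n+1)$ at once, and powers of the same matrix give the probabilities over longer horizons.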
The above is defined by the matrix you provided. Hope this helps!