I have a Markov Chain with states ${0,1,2,3,4,5}$ and transition matrix $$P=\begin{bmatrix}1&0&0&0&0&0\\ 0.5&0&0.5&0&0&0\\ 0&0.5&0&0.5&0&0\\ 0&0&0.5&0&0.5&0\\ 0&0&0&0.5&0&0.5\\0&0&0&0&0&1\end{bmatrix}$$
The chain starts in state $1$. This is essentially a symmetric random walk / Gambler's ruin with absorbing barriers at states $0$ and $5$.
I want to find a steady state vector, so I calculated the eigenvector (with all nonnegative entries) corresponding to the eigenvalue $\lambda=1$.
The vector I calculated as $$\begin{bmatrix}5\\4\\3\\2\\1\\0\end{bmatrix} \rightarrow \begin{bmatrix}\frac{1}{3}\\ \frac{4}{15}\\ \frac{1}{5}\\ \frac{2}{15}\\ \frac{1}{15}\\ 0\end{bmatrix}$$
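As a quick sanity check of that computation, here is a sketch in numpy (the matrix is just the $P$ above typed in by hand):

```python
import numpy as np

P = np.array([
    [1.0, 0.0, 0.0, 0.0, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0, 0.0, 0.0],
    [0.0, 0.5, 0.0, 0.5, 0.0, 0.0],
    [0.0, 0.0, 0.5, 0.0, 0.5, 0.0],
    [0.0, 0.0, 0.0, 0.5, 0.0, 0.5],
    [0.0, 0.0, 0.0, 0.0, 0.0, 1.0],
])

v = np.array([5, 4, 3, 2, 1, 0], dtype=float)
print(P @ v)  # → [5. 4. 3. 2. 1. 0.], so v is indeed a *right* eigenvector for λ = 1
```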
My question is: how can we have a steady state vector in which the probability of the absorbing state $5$ is zero?
I looked over some old lecture notes that state that there is no steady state vector if there are multiple absorbing states. Is that true?
Any help greatly appreciated!
Just stringing together my comments, since they became an answer:
You're calculating right eigenvectors, but if you write your transition matrix with the rows summing to $1$, you need to find *left* eigenvectors to get steady state row vectors: a stationary distribution $\pi$ satisfies $\pi P = \pi$, not $P\pi = \pi$. In this case, the left eigenspace corresponding to eigenvalue $1$ is $2$-dimensional (spanned by the point masses on the two absorbing states $0$ and $5$), so there won't be a single steady state vector. Your matrix is not irreducible.
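You can see the $2$-dimensional left eigenspace numerically; a minimal sketch, using the fact that left eigenvectors of $P$ are right eigenvectors of $P^{\mathsf T}$:

```python
import numpy as np

P = np.array([
    [1.0, 0.0, 0.0, 0.0, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0, 0.0, 0.0],
    [0.0, 0.5, 0.0, 0.5, 0.0, 0.0],
    [0.0, 0.0, 0.5, 0.0, 0.5, 0.0],
    [0.0, 0.0, 0.0, 0.5, 0.0, 0.5],
    [0.0, 0.0, 0.0, 0.0, 0.0, 1.0],
])

# Left eigenvectors of P are right eigenvectors of P transposed
eigvals, eigvecs = np.linalg.eig(P.T)
ones = np.isclose(eigvals, 1.0)
print(int(ones.sum()))  # → 2: eigenvalue 1 has a 2-dimensional (left) eigenspace

# The point masses on the two absorbing states are both stationary row vectors:
for pi in (np.eye(6)[0], np.eye(6)[5]):
    assert np.allclose(pi @ P, pi)
```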
There are infinitely many steady state vectors here (every convex combination of those two point masses is one), so the steady state is not unique. If the Markov chain is irreducible (or if some power of the matrix has strictly positive entries), this never happens: the stationary distribution is unique. If the Markov chain is reducible (or all powers of the matrix have zeroes), this sort of thing can happen, but does not necessarily. For instance, if there is exactly one absorbing state and it is accessible from all other states, then the Markov chain is reducible, but there's still a unique steady state (the one with a $1$ in the absorbing state and $0$ elsewhere).
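You can also see which stationary vector your chain actually converges to from its given start. Taking a high power of $P$ (a sketch; $n = 200$ is far more than enough here, since the second-largest eigenvalue has modulus $\cos(\pi/5) \approx 0.809$):

```python
import numpy as np

P = np.array([
    [1.0, 0.0, 0.0, 0.0, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0, 0.0, 0.0],
    [0.0, 0.5, 0.0, 0.5, 0.0, 0.0],
    [0.0, 0.0, 0.5, 0.0, 0.5, 0.0],
    [0.0, 0.0, 0.0, 0.5, 0.0, 0.5],
    [0.0, 0.0, 0.0, 0.0, 0.0, 1.0],
])

# Row i of P^n is the distribution after n steps when starting in state i.
Pn = np.linalg.matrix_power(P, 200)
print(np.round(Pn[1], 4))  # → [0.8 0. 0. 0. 0. 0.2]
```

Starting in state $1$, the chain is absorbed at $0$ with probability $4/5$ and at $5$ with probability $1/5$, matching the Gambler's ruin formula $i/N = 1/5$; a different starting state gives a different limit, which is another way of seeing that there is no single steady state.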