Suppose $P$ is a $3\times 3$ transition matrix where every state has a positive probability of going to every other state, including itself, so this is not an absorbing Markov chain. What I want to calculate is the probability of being in a particular state. I know how to calculate it after $n$ steps, but surely this can keep changing for higher values of $n$?
Thanks
A stationary distribution $\mathbf{\hat{p}}$ is a (left) eigenvector of $\mathbf{P}$ with eigenvalue $1$. Note that any nonzero multiple of $\mathbf{\hat{p}}$ is also an eigenvector of $\mathbf{P}$ but the stationary distribution $\mathbf{\hat{p}}$ is fixed by being a probability vector; that is, its components sum to one.
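To see this eigenvector characterization in action, here is a minimal NumPy sketch, assuming a concrete numeric transition matrix (the entries below are made up for illustration). A left eigenvector of $\mathbf{P}$ is a right eigenvector of $\mathbf{P}^\top$, so we take the eigenvector of $\mathbf{P}^\top$ with eigenvalue $1$ and rescale it so its components sum to one:

```python
import numpy as np

# Hypothetical 3x3 transition matrix (each row sums to 1), for illustration only.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.1, 0.4, 0.5]])

# A left eigenvector of P is a (right) eigenvector of P^T.
vals, vecs = np.linalg.eig(P.T)

# Pick the eigenvector whose eigenvalue is numerically closest to 1.
k = np.argmin(np.abs(vals - 1.0))
p_hat = np.real(vecs[:, k])

# Any nonzero multiple is still an eigenvector; fix the scale by
# normalizing so the components sum to one (a probability vector).
p_hat /= p_hat.sum()

print(p_hat)        # the stationary distribution
print(p_hat @ P)    # should match p_hat, since p_hat P = p_hat
```

The normalization step is exactly the "fixed by being a probability vector" remark above: without it, the eigenvector solver returns an arbitrary nonzero multiple of $\mathbf{\hat{p}}$.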
To find the stationary distribution $\mathbf{\hat{p}}$ of your Markov chain with the $3 \times 3$ transition matrix $\mathbf{P}$, you can solve the system of equations
$$\mathbf{\hat{p}}\mathbf{P} = \mathbf{\hat{p}}$$
With $\mathbf{P} = \begin{bmatrix} a & b & c \\ d & e & f \\ g & h & i \\ \end{bmatrix}$ this gives
$ap_1 + dp_2 + gp_3 = p_1$
$bp_1 + ep_2 + hp_3 = p_2$
$cp_1 + fp_2 + ip_3 = p_3$
And as $\mathbf{\hat{p}}$ is a probability vector it must also fulfill $p_1 + p_2 +p_3 = 1$.
Solving this system, you obtain the stationary distribution:
$\mathbf{\hat{p}}= [p_1 \:\: p_2 \:\: p_3]$
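The system above can also be solved directly as a linear system. One of the three equations $\mathbf{\hat{p}}\mathbf{P} = \mathbf{\hat{p}}$ is redundant (the rows of $\mathbf{P}^\top - I$ are linearly dependent), so a common trick is to replace one equation with the normalization constraint $p_1 + p_2 + p_3 = 1$. A sketch, again with a made-up matrix standing in for $[a,\dots,i]$:

```python
import numpy as np

# Hypothetical transition matrix standing in for [[a,b,c],[d,e,f],[g,h,i]].
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.1, 0.4, 0.5]])

# p P = p  is equivalent to  (P^T - I) p = 0.  One of these three
# equations is redundant, so replace the last row with the
# normalization condition p1 + p2 + p3 = 1.
A = P.T - np.eye(3)
A[-1, :] = 1.0
b = np.array([0.0, 0.0, 1.0])

p_hat = np.linalg.solve(A, b)
print(p_hat)   # stationary distribution; components sum to one
```

This approach avoids the eigenvalue machinery entirely and mirrors the hand calculation: three balance equations, with one swapped out for the probability constraint.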