What does "steady state equation" mean in the context of stochastic matrices?


I have very recently started learning about Markov chains, and I know what a stochastic matrix is. However, I came across a question like this:

If a transition probability matrix is of order $n\times n$, then the number of steady state equations is:

  1. $n$
  2. $n^2$
  3. $n-1$
  4. $n+1$

I'm not sure what is meant by "steady state equations"; I haven't come across that term while learning Markov chains. Could someone please explain, or provide a reference?



It's a bit unclear out of context, but I'd expect the equations to be $\pi P = \pi$, where $\pi$ is a row vector of state probabilities, together with the normalization requirement $\pi \mathbf{1} = 1$, where $\mathbf{1}$ is a column vector of all ones. That makes $n + 1$ linear equations in the $n$ unknowns. If the normalization requirement isn't counted, then $n$ would also be a sensible answer.
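As a concrete sketch of those $n+1$ equations, here is a small numerical example (the $3\times 3$ matrix is made up for illustration) that stacks $\pi P = \pi$ with $\pi \mathbf{1} = 1$ and solves the resulting overdetermined but consistent system by least squares:

```python
import numpy as np

# A hypothetical 3x3 transition matrix; each row sums to 1.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.2, 0.2, 0.6],
])
n = P.shape[0]

# Steady state conditions:
#   pi P = pi        -> (P^T - I) pi^T = 0   (n equations)
#   pi 1 = 1         -> sum of pi = 1        (1 equation)
# That is n + 1 linear equations in the n unknowns pi_1, ..., pi_n.
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.zeros(n + 1)
b[-1] = 1.0

# The system is consistent for an irreducible chain, so lstsq recovers pi exactly.
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print(pi)       # the stationary distribution
print(pi @ P)   # equals pi, confirming pi P = pi
```

Dropping the last row of `A` (the normalization equation) leaves just the $n$ equations $\pi P = \pi$, which only determine $\pi$ up to a scalar multiple; that is why the normalization is usually included in the count.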