Here is the transition diagram of a Markov chain with two transient states and two absorbing states:
We are asked to find the steady state(s), if any exist.
Please tell me if there is something wrong with my reasoning:
First, the steady-state probabilities are represented by a vector $\pi \in [0, 1]^n$ (with $n$ the number of states), where $\pi _i$ is the probability of finding the Markov chain in state $i$ as time goes to $+\infty$.
In other words, over a long enough run, $\pi _i$ represents the fraction of time spent in state $i$.
In this case, we have two absorbing states, which means that once we get into one of them, there is no way to exit.
If we apply the previous understanding of steady state (that of time portion...), we will get the following:
- As time goes to $+\infty$, the Markov chain will have moved to either state $0$ or state $3$, so the fraction of time spent in whichever of these states it ends up in is $1$, exclusively.
- The above point means that we have two steady states: either $\pi = (1, 0, 0, 0)$, or $\pi = (0, 0, 0, 1)$.

TLDR: The steady states are $\{(a,0,0,b)\mid a,b\ge 0,\ a+b=1\}$.
A steady state is a distribution that is unchanged by running one step of the Markov chain. When all mass is on the absorbing states, nothing changes, so these distributions are indeed steady states. Every convex combination of steady states is also a steady state.
In a steady state there can't be mass on states $1$ and $2$, because that mass would flow into states $0$ and $3$, and no mass would flow back.
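To make this concrete, here is a quick numerical check. The diagram isn't reproduced here, so the transition matrix below is a hypothetical example consistent with the description (states $0$ and $3$ absorbing, states $1$ and $2$ transient); it verifies that every distribution of the form $(a,0,0,b)$ with $a+b=1$ is stationary.

```python
import numpy as np

# Hypothetical transition matrix matching the description (the actual
# probabilities depend on the diagram): states 0 and 3 absorbing,
# states 1 and 2 transient. Row i gives the distribution of the next
# state when the chain is currently in state i.
P = np.array([
    [1.0, 0.0, 0.0, 0.0],   # state 0: absorbing
    [0.5, 0.0, 0.5, 0.0],   # state 1: moves to 0 or 2
    [0.0, 0.5, 0.0, 0.5],   # state 2: moves to 1 or 3
    [0.0, 0.0, 0.0, 1.0],   # state 3: absorbing
])

# Any distribution (a, 0, 0, b) with a + b = 1 satisfies pi @ P = pi.
for a in (0.0, 0.3, 1.0):
    pi = np.array([a, 0.0, 0.0, 1.0 - a])
    assert np.allclose(pi @ P, pi)
```

A distribution with mass on state $1$ or $2$ fails this check: for instance $(0,1,0,0)\,P = (0.5, 0, 0.5, 0) \ne (0,1,0,0)$.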
I think there is a bit of a misconception in the following statement:
Only irreducible Markov chains have a single steady state. Since you have two absorbing states, your chain is not irreducible, and while the Markov chain converges to some steady state, that steady state is not unique. By symmetry, for example, you can see that the Markov chain started in $(0,0.5,0.5,0)$ converges to the steady state $(0.5,0,0,0.5)$.
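You can watch this convergence numerically. Again the matrix below is an assumed symmetric example (the diagram's actual probabilities may differ); the point is that the limit depends on the starting distribution.

```python
import numpy as np

# Assumed symmetric transition matrix: states 0 and 3 absorbing,
# and swapping 0<->3, 1<->2 leaves the chain unchanged.
P = np.array([
    [1.0, 0.0, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0],
    [0.0, 0.5, 0.0, 0.5],
    [0.0, 0.0, 0.0, 1.0],
])

# Start with all mass split between the two transient states.
v = np.array([0.0, 0.5, 0.5, 0.0])
for _ in range(200):
    v = v @ P   # one step of the chain: distribution times matrix

# By symmetry the mass splits evenly between the absorbing states.
print(v.round(6))
```

Starting instead from $(0,1,0,0)$ gives a different limit, which is exactly why there is no single steady state here.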
Furthermore, if the chain is not aperiodic, the fraction of time the Markov chain spends in state $i$ up to time $t$ converges, but the probability that it is in state $i$ at time $t$ does not. Take for example the Markov chain that always jumps between states 0 and 1. The fraction of time it spends in each state converges to $\frac12$, but the probability of being in a given state is entirely determined by whether the time is even or odd.
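The two-state example above can be sketched directly:

```python
import numpy as np

# Two-state chain that always jumps: it has period 2, so the
# distribution at time t alternates and never converges, while the
# fraction of time spent in each state does converge.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])

v = np.array([1.0, 0.0])        # start deterministically in state 0
time_in_state = np.zeros(2)
steps = 1000
for _ in range(steps):
    time_in_state += v          # v stays a point mass, so this counts visits
    v = v @ P                   # after each step the mass has jumped

# Empirical fraction of time in each state converges to (1/2, 1/2),
# yet v itself is (1, 0) at even times and (0, 1) at odd times.
print(time_in_state / steps)
```

After an even number of steps the chain is back in state 0 with probability 1, which illustrates why periodicity blocks convergence of the time-$t$ distribution.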
A Markov chain on a finite space is called ergodic if it is irreducible and aperiodic. A Markov chain converges to a single steady state if it is ergodic.