Absorbing Markov chain with fewer transient states than absorbing states


I have a probability matrix:

   1    2    3
1  0.5  0.3  0.2
2  0    1    0
3  0    0    1

I understand that:

$$ Q = \left(\begin{array}{c} 0.5 \end{array} \right) \\ R = \left(\begin{array}{cc} 0.3 & 0.2 \end{array} \right) \\ I = \left(\begin{array}{cc} 1 & 0 \\ 0 & 1\end{array} \right) $$

I calculated the fundamental matrix $ N $ to be:

$$ N = (I - Q)^{-1} = \left(\begin{array}{cc} 1 & 0 \\ 0 & 2 \end{array} \right) $$

But when I try to calculate the absorption-probability matrix $ B = NR $, the multiplication is undefined: I'm multiplying a $ 2 \times 2 $ matrix by a $ 1 \times 2 $ matrix.

$$ B = NR = \left(\begin{array}{cc} 1 & 0 \\ 0 & 2\end{array} \right) \left(\begin{array}{cc} 0.3 & 0.2\end{array} \right) $$

Answer:

As @Arkamis pointed out, my error was that $ (I-Q)^{-1} $ shouldn't be a $ 2 \times 2 $ matrix in the first place.

My mistake was computing $ I - Q $ like this:

$$ I - Q= \left( \begin{array}{cc} 1 & 0 \\ 0 & 1 \end{array} \right) - \left( \begin{array}{cc} 0.5 & 0 \\ 0 & 0 \end{array} \right) $$

The fix is not to "expand" $ Q $ to fit $ I $; instead, $ I $ should match the size of $ Q $:

$$ I - Q = (1) - (0.5) = (0.5) $$

So $ N = (0.5)^{-1} = (2) $, which is $ 1 \times 1 $, and $ B = NR = (2)\left(\begin{array}{cc} 0.3 & 0.2 \end{array}\right) = \left(\begin{array}{cc} 0.6 & 0.4 \end{array}\right) $: from state 1, the chain is absorbed in state 2 with probability 0.6 and in state 3 with probability 0.4.
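The whole calculation can be checked numerically. Here is a short sketch using NumPy (the state ordering and variable names are my own choices, matching the matrix above):

```python
import numpy as np

# Transition matrix: state 1 is transient, states 2 and 3 are absorbing.
P = np.array([[0.5, 0.3, 0.2],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])

# Canonical-form blocks: Q is transient-to-transient,
# R is transient-to-absorbing.
Q = P[:1, :1]   # 1x1 block: (0.5)
R = P[:1, 1:]   # 1x2 block: (0.3  0.2)

# Fundamental matrix N = (I - Q)^{-1}; the identity matches Q's size (1x1).
N = np.linalg.inv(np.eye(Q.shape[0]) - Q)   # [[2.0]]

# Absorption probabilities B = N R: a 1x1 times a 1x2 multiplies fine.
B = N @ R                                    # [[0.6, 0.4]]
print(N)
print(B)
```

Because $ I $ is built with `np.eye(Q.shape[0])`, the dimension mismatch from the question can't occur: the identity is always the same size as $ Q $.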