Repair Chain (Markov Chain Sample Model)


A machine has $3$ critical parts that are subject to failure, but it can function as long as two of these parts are working. When two parts are broken, they are replaced and the machine is back in working order the next day. To formulate a Markov chain model, we take the state space to be the set of parts that are currently broken, $\{0, 1, 2, 3, 12, 13, 23\}$ (where $0$ means nothing is broken). If we assume that parts $1$, $2$, and $3$ fail with probabilities $0.01$, $0.02$, and $0.04$, respectively, but no two parts fail on the same day, then we arrive at the following transition matrix.

$$ \begin{array}{c|ccccccc} & 0 & 1 & 2 & 3 & 12 & 13 & 23 \\ \hline 0 & 0.93 & 0.01 & 0.02 & 0.04 & 0 & 0 & 0 \\ 1 & 0 & 0.94 & 0 & 0 & 0.02 & 0.04 & 0 \\ 2 & 0 & 0 & 0.95 & 0 & 0.01 & 0 & 0.04 \\ 3 & 0 & 0 & 0 & 0.97 & 0 & 0.01 & 0.02 \\ 12 & 1 & 0 & 0 & 0 & 0 & 0 & 0 \\ 13 & 1 & 0 & 0 & 0 & 0 & 0 & 0 \\ 23 & 1 & 0 & 0 & 0 & 0 & 0 & 0 \end{array} $$
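To make sure I copied the matrix down correctly, here is my transcription as a NumPy array (the state order and entries are just my reading of the table above), together with a check that each row is a probability distribution:

```python
import numpy as np

# States in the same order as the rows/columns of the matrix:
# which parts are currently broken ("0" = nothing broken).
states = ["0", "1", "2", "3", "12", "13", "23"]

P = np.array([
    [0.93, 0.01, 0.02, 0.04, 0.00, 0.00, 0.00],  # from 0: any one part may fail
    [0.00, 0.94, 0.00, 0.00, 0.02, 0.04, 0.00],  # from 1: part 2 or 3 may also fail
    [0.00, 0.00, 0.95, 0.00, 0.01, 0.00, 0.04],  # from 2: part 1 or 3 may also fail
    [0.00, 0.00, 0.00, 0.97, 0.00, 0.01, 0.02],  # from 3: part 1 or 2 may also fail
    [1.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],  # from 12: repaired, back to 0
    [1.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],  # from 13: repaired, back to 0
    [1.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],  # from 23: repaired, back to 0
])

# Sanity check: every row of a transition matrix must sum to 1.
print(P.sum(axis=1))
```

At least the rows do all sum to $1$, so the matrix itself seems consistent.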

I don't have any problem with the interpretation of $p(0,0)$, $p(0,1)$, $p(0,2)$, and $p(0,3)$: given that everything is working fine, there is a $93\%$ chance everything will still be fine tomorrow, a $1\%$ chance that part $1$ fails, a $2\%$ chance that part $2$ fails, and a $4\%$ chance that part $3$ fails. What I don't get is everything underneath.
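To get a feel for the dynamics, I also tried simulating the chain day by day (again using my own transcription of the matrix; the long-run state frequencies are just a sanity check, not part of the question):

```python
import numpy as np

# My transcription of the transition matrix; state order as in the question.
states = ["0", "1", "2", "3", "12", "13", "23"]
P = np.array([
    [0.93, 0.01, 0.02, 0.04, 0.00, 0.00, 0.00],
    [0.00, 0.94, 0.00, 0.00, 0.02, 0.04, 0.00],
    [0.00, 0.00, 0.95, 0.00, 0.01, 0.00, 0.04],
    [0.00, 0.00, 0.00, 0.97, 0.00, 0.01, 0.02],
    [1.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],
    [1.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],
    [1.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],
])

rng = np.random.default_rng(0)
n_days = 100_000
x = 0                          # start with nothing broken
visits = np.zeros(len(states))
for _ in range(n_days):
    visits[x] += 1
    x = rng.choice(len(states), p=P[x])  # sample tomorrow's state from row x

# Long-run fraction of days spent in each state.
print(dict(zip(states, np.round(visits / n_days, 3))))
```

So the simulation runs fine mechanically; it's the meaning of the rows below the first one that I'm stuck on.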