Markov chain: two-state chain


I have a machine with two states: working or broken. If it is working, it breaks down with probability $q = 0.1$. While the machine is working, I earn \$1000 per day. While it is broken, a repairman charges me \$$200/(1-p)$ per day, and he fixes the machine with probability $p$. Assume the transitions from broken to working (and vice versa) are independent. Find the $p$ that maximizes the expected profit.

Attempt:

This is a two-state Markov chain. Let state 0 be working and state 1 be broken. The transition matrix is:

\begin{pmatrix} 1-q & q \\ p & 1-p \\ \end{pmatrix}

The stationary distribution solves $\pi P = \pi$, i.e. the balance equation $\pi_0 q = \pi_1 p$ together with $\pi_0 + \pi_1 = 1$, which gives $\pi_0 = p/(p+q)$ and $\pi_1 = q/(p+q)$.
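As a sanity check, the closed form $\pi_0 = p/(p+q)$, $\pi_1 = q/(p+q)$ can be verified numerically by iterating the chain (a small sketch; $p = 0.4$ is an arbitrary test value, $q = 0.1$ is from the problem):

```python
# Numerical check of the stationary distribution pi = (p/(p+q), q/(p+q)).
# p is an arbitrary test value; q = 0.1 comes from the problem statement.
p, q = 0.4, 0.1

# Transition matrix: state 0 = working, state 1 = broken.
P = [[1 - q, q],
     [p, 1 - p]]

pi = [1.0, 0.0]  # start in the working state
for _ in range(1000):  # iterate pi <- pi P until convergence
    pi = [pi[0] * P[0][0] + pi[1] * P[1][0],
          pi[0] * P[0][1] + pi[1] * P[1][1]]

print(pi)                          # converged distribution
print([p / (p + q), q / (p + q)])  # closed form: [0.8, 0.2]
```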

The expected long-run profit per day is: $1000\,\pi_0 - \dfrac{200}{1-p}\,\pi_1 = \dfrac{1000p}{p+q} - \dfrac{200}{1-p}\cdot\dfrac{q}{p+q}$. (Do you think this is correct?)
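One way to locate the maximizing $p$ is a simple numerical search over $p \in (0,1)$ with $q = 0.1$ fixed (a sketch using the profit formula above, not a closed-form solution):

```python
# Grid search for the p that maximizes the expected daily profit
#   f(p) = 1000 * p/(p+q) - (200/(1-p)) * q/(p+q),   with q = 0.1.
q = 0.1

def profit(p):
    pi0 = p / (p + q)  # long-run fraction of days the machine works
    pi1 = q / (p + q)  # long-run fraction of days it is broken
    return 1000 * pi0 - (200 / (1 - p)) * pi1

# Evaluate f on a fine grid strictly inside (0, 1).
grid = [i / 100_000 for i in range(1, 100_000)]
best = max(grid, key=profit)
print(best, profit(best))  # optimum near p = 0.69
```

(Setting the derivative of $f(p)$ to zero gives the same interior maximum; the grid search is just a quick cross-check of the formula.)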