Umbrella Markov chain problem


A man has one umbrella, and he commutes between his house and work. If it is raining and the umbrella is at his current location, he takes it with him. If it is not raining, or the umbrella is not there, he travels without it. I am trying to set up the transition probabilities for the Markov chain whose state is whether he has the umbrella at his current location. I can see that the probability of transitioning from having it to not having it is 1 − (probability of rain), and the probability of transitioning from having it to having it is the probability of rain. I cannot, however, figure out the other two transition probabilities. It seems like a very simple question; I'm just stuck. If someone could point me in the right direction, that would be great.


Well, if he doesn't have it, then it must be at his other location, right? So when he arrives there, the umbrella is waiting for him regardless of the weather. Therefore the probability of transitioning from not having it to having it is 1, and the probability of transitioning from not having it to not having it is 0.
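A quick simulation can confirm these transition probabilities. The sketch below assumes a concrete rain probability of 0.3 (the original question leaves it symbolic) and tallies the observed transitions between the two states, "has umbrella at current location" and "does not":

```python
import random

def simulate_trips(p_rain, n_trips, seed=0):
    """Simulate the commuter's trips and count transitions between
    'has umbrella here' (True) and 'no umbrella here' (False).

    p_rain is an assumed numeric rain probability; the original
    question keeps it symbolic.
    """
    rng = random.Random(seed)
    has_umbrella = True  # assume he starts with the umbrella
    counts = {(True, True): 0, (True, False): 0,
              (False, True): 0, (False, False): 0}
    for _ in range(n_trips):
        if has_umbrella and rng.random() < p_rain:
            next_has = True   # raining and umbrella present: he takes it
        elif has_umbrella:
            next_has = False  # dry, so he leaves it at the old location
        else:
            next_has = True   # umbrella was at the other location all along
        counts[(has_umbrella, next_has)] += 1
        has_umbrella = next_has
    return counts

counts = simulate_trips(p_rain=0.3, n_trips=100_000)
# From "no umbrella", he always arrives where the umbrella is:
print(counts[(False, False)])  # 0
```

The `(False, False)` count is always zero, matching the answer above, and the fraction of `(True, True)` transitions among trips started with the umbrella converges to `p_rain`, matching the transition probabilities in the question.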