Modelling problems using Markov chains.


A bucket contains four red and three green balls. Each ball is equally likely to be picked. If a red ball is drawn, it is removed from the bucket; if a green ball is drawn, it is returned to the bucket. The game continues until all four red balls have been removed.

Model this game as a Markov chain and calculate the expected number of steps before the game finishes.


Now, I'm not quite sure how to draw a probability tree diagram using LaTeX, so I won't do so here. I can certainly model the game as a tree, but I am confused about how to set it up as a Markov chain.

My initial thought is to use two separate chains, depending on the colour of the first ball I pick. Is that the way to go about it? If so, I will edit this post and show my attempt. If there is a way to combine everything into a single Markov chain, I would very much appreciate a pointer on where to begin.
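Whatever form the chain takes, a quick Monte Carlo simulation of the rules as stated gives a number to check the model against. A minimal sketch (the function name, seed, and trial count are arbitrary choices, not from the problem):

```python
import random

def play_once(rng):
    """Simulate one game; return the number of draws until all reds are gone."""
    red, green = 4, 3
    steps = 0
    while red > 0:
        steps += 1
        # Each of the red + green balls in the bucket is equally likely.
        if rng.random() < red / (red + green):
            red -= 1  # a red ball is drawn and removed
        # a green ball is drawn and returned, so the counts are unchanged
    return steps

rng = random.Random(0)
trials = 100_000
avg = sum(play_once(rng) for _ in range(trials)) / trials
print(avg)  # empirical mean, close to the exact answer of 10.25
```

With this many trials the sample mean lands within a few hundredths of the true expectation.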

Thanks in advance.


Best answer:

Each state is the number of red balls remaining; since green balls are always returned, three greens stay in the bucket throughout. From state r (with r red balls left), the chain moves to state r-1 with probability r/(r+3) (a red ball is drawn) and stays in state r with probability 3/(r+3) (a green ball is drawn), so the transition probabilities are just the chance that a ball chosen at random from what remains is red. The number of draws needed to leave state r is therefore geometric with mean (r+3)/r, and the expected number of steps starting from state 4 is the sum 7/4 + 6/3 + 5/2 + 4/1 = 41/4 = 10.25.
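The expected-steps recursion E[0] = 0, E[r] = (r+3)/r + E[r-1] can be evaluated exactly with rational arithmetic; a short sketch:

```python
from fractions import Fraction

# States are the number of red balls remaining: 4, 3, 2, 1 (0 is absorbing).
# From state r a red is drawn with probability r/(r+3), so the draws needed
# to leave state r form a geometric variable with mean (r+3)/r.
E = Fraction(0)
for r in range(1, 5):
    E += Fraction(r + 3, r)  # E[r] = (r+3)/r + E[r-1]
print(E, float(E))  # 41/4 10.25
```

Working in `Fraction` avoids any floating-point rounding and reproduces the closed-form 41/4 directly.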