Modelling probability problems by Markov chains


For one of my courses, we have to think about how we could model certain problems with the help of Markov chains. Most are straightforward, but I find it difficult to choose the right states and transitions to model the following two situations:

  • Two people pick three different numbers at random between 1 and 10. Construct a Markov chain (with 6 or more states) to determine the probability that they picked two or more identical numbers.
  • Set up a Markov chain to determine the probability that two or more persons have the same birthday among a group of n people.

In both cases, the purpose is not to calculate the exact probability, but to construct a Markov chain that would be suitable to determine the probability (possibly with the help of hitting probabilities, steady state distributions etc.). What would be a good/smart choice for the states and transitions for each of the problems?

Best answer:

(1) You can think of the picking as a step-by-step process: at each step, each player chooses one more number, so after step $n$ both players have picked $n$ numbers, with $n$ running from $0$ to $3$.

The states could then be the possible ordered pairs, $(n,m)$, of step number and number of matches so far:

\begin{align} S = \{& (0,0), \\ & (1,0),\quad (1,1), \\ & (2,0),\quad (2,1),\quad (2,2), \\ & (3,0),\quad (3,1),\quad (3,2),\quad (3,3)\}. \end{align}
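A minimal simulation sketch of this chain (not part of the original answer; it assumes each player's three numbers are distinct and drawn uniformly from $\{1,\dots,10\}$): it walks the $(n,m)$ states one step at a time and estimates the probability of ending in a state with $m \ge 2$. As a sanity check, the exact answer is hypergeometric: $\left[\binom{3}{2}\binom{7}{1}+\binom{3}{3}\right]/\binom{10}{3} = 11/60 \approx 0.1833$.

```python
import random

def run_chain(trials=200_000, seed=0):
    """Estimate P(two or more identical numbers) by simulating the
    (step, matches) chain: each player reveals one new number per step,
    drawn uniformly from the numbers they have not picked yet."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        a, b = set(), set()
        state = (0, 0)                      # start in state (0, 0)
        for step in range(1, 4):
            a.add(rng.choice([x for x in range(1, 11) if x not in a]))
            b.add(rng.choice([x for x in range(1, 11) if x not in b]))
            state = (step, len(a & b))      # move to (step, matches so far)
        if state[1] >= 2:                   # ended in (3, 2) or (3, 3)
            hits += 1
    return hits / trials

print(run_chain())   # close to 11/60
```

The transition probabilities out of a state $(n,m)$ could also be written down in closed form, but the simulation makes it easy to check that the chosen states really do capture enough information about the process.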

(2) You can think of this one as a process that starts with an empty group of people and adds one new person at each step. We have one (absorbing) state, $M$ say, representing the case where the group contains at least one matching/shared birthday. The other states are $0,1,\ldots,n$, where state $k$ represents the case that the group has $k$ members and no matching birthdays. From state $k$, the newcomer either shares one of the $k$ existing birthdays (transition to $M$ with probability $k/365$) or has a new one (transition to $k+1$ with probability $(365-k)/365$). So,

\begin{align} S = \{M,0,1,\ldots,n\}. \end{align}
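As a sketch of how this chain could be used numerically (an addition, assuming 365 equally likely birthdays and ignoring leap years): build the transition matrix, propagate the initial distribution for $n$ steps, and read off the mass in $M$, i.e. the hitting probability of the absorbing state.

```python
import numpy as np

def birthday_match_prob(n, days=365):
    """Probability of at least one shared birthday among n people,
    computed as the absorption probability of state M."""
    # Indices 0..n are the "no match yet" states; index n+1 is M.
    P = np.zeros((n + 2, n + 2))
    for k in range(n):
        P[k, k + 1] = (days - k) / days   # newcomer has a fresh birthday
        P[k, n + 1] = k / days            # newcomer matches -> absorbed in M
    P[n, n] = 1.0                          # all n people added, no match
    P[n + 1, n + 1] = 1.0                  # M is absorbing
    dist = np.zeros(n + 2)
    dist[0] = 1.0                          # start with an empty group
    for _ in range(n):                     # add one person per step
        dist = dist @ P
    return dist[n + 1]

print(birthday_match_prob(23))   # the classic result: just over 1/2
```

After $n$ steps all probability mass sits either in state $n$ or in $M$, so the value returned agrees with the familiar product formula $1-\prod_{k=0}^{n-1}\frac{365-k}{365}$.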