Probability of not reaching completion in Markov process


This question is supposed to be easy but is very hard for me.

The Norwegian Skating Association has mass-produced "collectors' cards" featuring all $N$ speedskaters (Norwegian as well as international ones). The cards are glossy and informative, with a dramaturgically inspirational photo on one side and information about personal records and particular feats on the other, and they are sold in the nation's kiosks for $1$ krone per card. When a person buys a card, however, he receives one at random from the kiosk's collection and cannot check in advance which card it is; he may therefore expect to have to buy considerably more than $N$ cards in order to complete his collection of $N$ different cards.

We now follow one of our civilisation's card collectors, and let $X_n$ denote the number of different cards acquired after having bought $n$ cards (it is assumed that he buys one card at a time). We shall presume that $\{X_1, X_2, \ldots \}$ is a Markov chain on the state space $\{1, \ldots, N\}$, with transition probabilities \begin{eqnarray} \begin{matrix} p_{i, i+1} = \Pr\{ X_{n+1} = i + 1 \mid X_n = i \} = 1 - i/N,\\ p_{i,i} = \Pr\{ X_{n+1} = i \mid X_n = i \} = i/N, \end{matrix} \tag 1 \end{eqnarray} for $i = 1, \ldots, N$. The chain starts at $X_1 = 1$, and the collection is complete as soon as $X_n = N$.
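As a sanity check on $(1)$, one can build the transition matrix explicitly for a small $N$ and verify that each row sums to $1$ and that state $N$ is absorbing. (This is my own illustration, not part of the question; the choice $N = 5$ is arbitrary.)

```python
# Transition matrix of the chain in (1) for a small N.
# States are 1..N; index 0 of each row is unused for readability.
N = 5

P = [[0.0] * (N + 1) for _ in range(N + 1)]
for i in range(1, N + 1):
    P[i][i] = i / N                  # drew a card already owned
    if i < N:
        P[i][i + 1] = 1 - i / N      # drew a new card

# Each row is a probability distribution over the next state.
for i in range(1, N + 1):
    assert abs(sum(P[i]) - 1.0) < 1e-12

print(P[N][N])  # 1.0, so state N is absorbing
```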

  1. Explain briefly which conditions need to hold in order that the Markovian assumption and the formulae $(1)$ may be seen as correct. Explain furthermore which of the states (if any) are transient, which (if any) are recurrent, and which (if any) are absorbing. $\color{red}{\text{What is the probability that his collection shall never reach completion}}$?

If you were supposed to answer that question, how would you do it? Is it OK to say that all states besides $N$ are transient (from each $i < N$ the chain moves up with probability $1 - i/N > 0$ and can never move down), while $N$ is absorbing, so the chain must reach $N$ eventually? Hence the probability of never completing the collection is $0$?
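One way to make this plausible numerically is to iterate the distribution of $X_n$ under $(1)$ and watch $\Pr\{X_n < N\}$ shrink toward $0$. A minimal sketch, assuming a small $N = 5$ chosen for illustration:

```python
# Iterate the distribution of X_n for the chain in (1) and track
# the probability that the collection is still incomplete.
N = 5

dist = [0.0] * (N + 1)  # dist[i] = Pr{X_n = i}, states 1..N
dist[1] = 1.0           # the chain starts at X_1 = 1

def step(d):
    """Push the distribution forward one purchase."""
    new = [0.0] * (N + 1)
    for i in range(1, N + 1):
        new[i] += d[i] * (i / N)           # stay: duplicate card
        if i < N:
            new[i + 1] += d[i] * (1 - i / N)  # move up: new card
    return new

for n in range(1, 200):
    dist = step(dist)

print(1 - dist[N])  # Pr{collection incomplete after 200 cards}: tiny
```

The incomplete-collection probability decays geometrically (the slowest rate is $(N-1)/N$ per step), which matches the intuition that the only recurrent state is the absorbing state $N$.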