Different Cards Problem with Markov chains


Suppose a brand of cards has $n\,(>1)$ different cards, and each pack we open contains one uniformly random card. Let $X$ be the number of packs we need to open in order to collect all $n$ different cards.

We are asked to calculate the expectation and the variance of $X$.


There is a hint: let $Y_t$ be the number of different cards collected from the first $t$ packs, and show that $(Y_t)$ is a Markov chain.

My attempt:

Intuitively, $(Y_t)$ is a Markov chain: once we know $Y_{t-1}$, the values $Y_{t-2},\dots,Y_0$ give no additional information about $Y_t$.
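To make this precise, here is how I would write down the transition probabilities (my own attempt, assuming each pack's card is uniform and independent of the others):

```latex
% From state Y_t = k (with 0 <= k < n), the next card duplicates one of the
% k cards already held with probability k/n, and is new with probability (n-k)/n:
P(Y_{t+1} = k \mid Y_t = k) = \frac{k}{n},
\qquad
P(Y_{t+1} = k+1 \mid Y_t = k) = \frac{n-k}{n},
\qquad 0 \le k < n,
% and the state n is absorbing: P(Y_{t+1} = n | Y_t = n) = 1.
```

so the chain can only stay put or move up by one, which is why I expect the time spent in each state to matter.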

I also know that $E[X]=E[T]$, where $T$ is the first time with $Y_T=n$. I think $T$ should be something like a first passage time, but I don't know how to proceed from there.
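As a sanity check for whatever formula I eventually derive, I wrote a small Monte Carlo simulation (my own sketch; the function names are mine) that estimates $E[X]$ and $\operatorname{Var}(X)$ by repeatedly running the collection process:

```python
import random

def packs_to_collect(n, rng):
    """Open packs of uniformly random cards until all n distinct cards are seen."""
    seen = set()
    packs = 0
    while len(seen) < n:
        seen.add(rng.randrange(n))  # each pack yields one uniform card
        packs += 1
    return packs

def estimate(n, trials=100_000, seed=0):
    """Sample mean and (unbiased) sample variance of X over many trials."""
    rng = random.Random(seed)
    samples = [packs_to_collect(n, rng) for _ in range(trials)]
    mean = sum(samples) / trials
    var = sum((s - mean) ** 2 for s in samples) / (trials - 1)
    return mean, var

if __name__ == "__main__":
    mean, var = estimate(5)
    # For n = 5 the exact expectation is 5*(1 + 1/2 + 1/3 + 1/4 + 1/5) ≈ 11.42,
    # so the simulated mean should land close to that.
    print(mean, var)
```

This at least lets me check a closed-form answer numerically before trusting it.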

I am self-learning this theory, and it is a bit hard for a rookie to come up with the trick. Thanks for any help or hints.