I'm working on a variant of the coupon collector's problem: I draw $v_n$ coupons and look at the expected number of distinct coupons I get. If I set $v_n$ equal to the expected number of draws needed to collect all the coupons, I get something interesting. The problem statement goes:
Given $n$ coupons, how many draws with replacement do you expect to need before you have drawn each coupon at least once? I solved this with a Markov chain, where you are in state $i$ when you currently hold $i$ different coupons, and the transition probability from $i$ to $i+1$ is $\frac{n-i}{n}$ (otherwise you stay at $i$). The answer comes out to be: $$v_n=\sum_{i=1}^{n}{\frac{n}{i}}$$
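As a quick sanity check (my own sketch, not part of the derivation; the function names are mine), the closed form $v_n = n H_n$ can be compared against a direct simulation of the collection process:

```python
import random

def expected_draws(n):
    # Closed form: v_n = sum_{i=1}^{n} n/i = n * H_n
    return sum(n / i for i in range(1, n + 1))

def simulate_draws(n, trials=20000, seed=0):
    # Average number of draws until all n coupons have been seen at least once
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        seen = set()
        draws = 0
        while len(seen) < n:
            seen.add(rng.randrange(n))
            draws += 1
        total += draws
    return total / trials

print(expected_draws(10))   # 10 * H_10 ≈ 29.2897
print(simulate_draws(10))   # should be close to the value above
```

For $n=10$ the simulated average lands within a fraction of a draw of $10\,H_{10} \approx 29.29$, which matches the Markov-chain answer.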
Now, given $k$ draws (cereal boxes), what is the expected number of different coupons I get? Let $X_j$ be an indicator random variable: $X_j=1$ if coupon $j$ appears among the $k$ draws, and $X_j=0$ otherwise.
$$E[X_j]=1-\left(\frac{n-1}{n}\right)^{k}$$
The expected number of different coupons is then $$E\left[\sum_{j=1}^{n}{X_j}\right]=\sum_{j=1}^{n}{E[X_j]}=n\left(1-\left(\frac{n-1}{n}\right)^k\right)$$
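The linearity-of-expectation formula above can also be checked numerically (again my own sketch; names are mine):

```python
import random

def expected_distinct(n, k):
    # E[sum X_j] = n * (1 - ((n-1)/n)^k)
    return n * (1 - ((n - 1) / n) ** k)

def simulate_distinct(n, k, trials=20000, seed=0):
    # Average number of distinct coupons seen in k draws with replacement
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        total += len({rng.randrange(n) for _ in range(k)})
    return total / trials

print(expected_distinct(10, 10))  # 10 * (1 - 0.9**10) ≈ 6.5132
print(simulate_distinct(10, 10))  # should be close to the value above
```

Note that the $X_j$ are not independent, but linearity of expectation does not require independence, which is why the simple sum works.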
If I set $k=v_n=\sum_{i=1}^{n}{\frac{n}{i}}$ and check how far I am from collecting all $n$ coupons, the expected number of coupons still missing is $$n-n\left(1-\left(\frac{n-1}{n}\right)^{v_n}\right)=n\left(\frac{n-1}{n}\right)^{v_n}$$
For very large $n$ the value comes out to be approximately $0.561459483568$, which I looked up and it equals $e^{-\gamma}$, where $\gamma$ is the Euler–Mascheroni constant. I can sort of see where the $\gamma$ comes from via the harmonic series, since $H_n = \ln n + \gamma + o(1)$. But is there a better, more intuitive solution for this?
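For what it's worth, here is the numeric check I ran (a sketch with my own function name), evaluating $n\left(\frac{n-1}{n}\right)^{n H_n}$ for increasing $n$ and comparing it to $e^{-\gamma}$:

```python
import math

GAMMA = 0.5772156649015329  # Euler–Mascheroni constant

def missing_coupons(n):
    # n * ((n-1)/n)^{v_n} with v_n = n * H_n
    h_n = sum(1.0 / i for i in range(1, n + 1))
    return n * ((n - 1) / n) ** (n * h_n)

for n in (10, 1000, 100000):
    print(n, missing_coupons(n))
print(math.exp(-GAMMA))  # ≈ 0.5614594836
```

The sequence visibly converges toward $e^{-\gamma} \approx 0.5615$, consistent with substituting $H_n \approx \ln n + \gamma$ into the exponent.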