I am reading Erdős and Rényi's *On a Classical Problem of Probability Theory*, where they derive the asymptotic distribution for the Coupon Collector's problem of collecting at least one coupon of every type (more generally, they solve it for at least $m$ of each type). In particular, they prove that if $v_m(n)$ denotes the first time at which each of the $n$ types has been collected at least $m$ times, then
$$ \lim_{n \to \infty} \mathbb{P}\left( v_m(n) < n\log n + (m-1)\, n \log\log n + nx \right) = \exp\left( -\frac{e^{-x}}{(m-1)!} \right) , \quad x \in \mathbb{R} . $$
My question is about their remark on how this equation can be used to deduce that the expected value is
$$ \mathbb{E}[v_m(n)] = n \log n + (m-1)\, n \log\log n + nC_m + o(n) , $$
and how it can also be used to deduce that the constant $C_m = \gamma - \log(m-1)!$, where $\gamma$ is the Euler–Mascheroni constant.
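At least the value of the constant is consistent with the mean of the limit distribution, assuming one may exchange the limit with the expectation (which itself needs justification, e.g. uniform integrability of the normalized variables): if $X$ has the limiting CDF
$$ F_m(x) = \exp\left( -\frac{e^{-x}}{(m-1)!} \right) , $$
then writing
$$ \frac{e^{-x}}{(m-1)!} = e^{-\left(x + \log(m-1)!\right)} $$
shows that $X + \log(m-1)!$ has the standard Gumbel distribution $\exp(-e^{-x})$, whose mean is $\gamma$. Hence $\mathbb{E}[X] = \gamma - \log(m-1)!$, matching $C_m$.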
In the case $m = 1$, this follows from a direct computation of the expected value together with the asymptotics of the harmonic numbers. But I am not sure exactly how the asymptotic distribution can be used to derive the expected value, as the authors remark. Intuitively it makes a lot of sense, since the limit law shows that $v_m(n)$ concentrates around $n\log n + (m-1)\, n\log \log n + nC_m$ for some constant $C_m$, with $o(n)$ error.
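For the case $m = 1$, here is a small sanity check of the remark (my own sketch, not from the paper): $\mathbb{E}[v_1(n)] = n H_n$ exactly, where $H_n$ is the $n$-th harmonic number, and the remainder after subtracting $n\log n + \gamma n$ stays bounded (it tends to $1/2$ by the Euler–Maclaurin expansion of $H_n$), consistent with the $o(n)$ error term and $C_1 = \gamma$.

```python
import math

def expected_collection_time(n):
    """E[v_1(n)] = n * H_n: the sum of geometric waiting times for m = 1."""
    return n * sum(1.0 / k for k in range(1, n + 1))

GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

for n in (10**3, 10**4, 10**5):
    remainder = expected_collection_time(n) - n * math.log(n) - GAMMA * n
    print(n, remainder)  # remainder tends to 1/2 as n grows
```
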