I am trying to prove the following estimate of the expectation of the maximum of independent Poisson random variables. I became interested in this problem while reading Joel A. Tropp's "An Introduction to Matrix Concentration Inequalities" (Section 5.1.2), where the estimate is used as a key step in showing the optimality of the matrix Chernoff bound. I also believe it is useful in explaining the sharpness of the matrix Bernstein inequality.
Desired estimation:
Let $X_i, i = 1, 2, \dots, n$ be independent Poisson(1) random variables. Then, $$ \mathbb{E}\max_{1 \leq i\leq n}X_i \approx C\frac{\log n}{\log \log n}, $$ where $C$ is some constant.
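Before attempting a proof, the claim is easy to probe numerically. Here is a small NumPy sketch (the trial count and the values of $n$ are my own choices, not from any reference): it simulates $\mathbb{E}\max_i X_i$ and compares it with $\log n/\log\log n$, so the ratio should stay bounded as $n$ grows.

```python
import numpy as np

# Monte Carlo estimate of E[max_{1<=i<=n} X_i] for X_i iid Poisson(1),
# compared against the conjectured rate log(n) / log(log(n)).
rng = np.random.default_rng(0)

ratios = {}
for n in [10, 100, 1000, 10000]:
    samples = rng.poisson(lam=1.0, size=(2000, n))  # 2000 independent trials
    emp = samples.max(axis=1).mean()                # empirical E[max]
    pred = np.log(n) / np.log(np.log(n))
    ratios[n] = emp / pred
    print(f"n={n:6d}  E[max]~{emp:5.2f}  logn/loglogn={pred:5.2f}  ratio={emp / pred:4.2f}")
```

The ratio hovers around a constant rather than drifting to $0$ or $\infty$, which is consistent with the claimed rate (though of course a simulation is no substitute for a proof).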
Tropp used this fact without giving any proof or reference, but after working on it for a while, it does not seem trivial to me. After some searching, I found that the desired result seems to follow from results of A. C. Kimber and C. W. Anderson.
However, I think it would be nicer if we could find a more direct or elementary way to obtain the desired estimate.
Any advice, suggestions or references would be very much appreciated!
This is just a sketch, but you might be able to turn it into a proof.
The CDF of the maximum of $n$ independent variables is the product of their CDFs, so if they’re identically distributed this is the $n$-th power of their CDF. As $n\to\infty$, we have $F(x)\to1$ and thus
\begin{eqnarray} \log F(x)^n &=& n\log F(x) \\ &\approx& n(F(x)-1)\;. \end{eqnarray}
Since the distribution of the maximum is sharply concentrated, this quantity must pass through $\log\frac12$ near the mean (the point where the CDF of the maximum equals $\frac12$). For the Poisson distribution with mean $1$,
$$ F(x)-1=-\mathrm e^{-1}\sum_{k=\lceil x\rceil}^\infty\frac1{k!}\;. $$
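As a quick sanity check on this tail (a small illustration of my own, not part of the answer), one can compare the full sum against its leading term; the ratio approaches $1$ as the cutoff grows, which is what the next step of the sketch relies on.

```python
import math

# Poisson(1) upper tail: e^{-1} * sum_{k>=m} 1/k!  versus its first
# term e^{-1}/m!.  The terms decay superfactorially, so truncating the
# sum after ~30 terms is more than enough for double precision.
ratios = []
for m in [2, 5, 10, 20]:
    tail = math.exp(-1) * sum(1.0 / math.factorial(k) for k in range(m, m + 30))
    first = math.exp(-1) / math.factorial(m)
    ratios.append(first / tail)
    print(m, first / tail)  # ratio tends to 1 as m grows
```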
As $x\to\infty$ (which is forced when $n\to\infty$), this sum is dominated by its first term, $-\frac1{\mathrm e\lceil x\rceil!}$. Setting
$$ \frac n{\mathrm ex!}\approx\log2 $$
and using Stirling’s approximation for the factorial yields
$$ x(\log x - 1)\approx\log n\;, $$
which leads to $x\log x\approx\log n$; taking logarithms again gives $\log x\approx\log\log n$, and substituting back yields
$$x\sim\frac{\log n}{\log\log n}\;.$$