Upper Bound on the Expected Maximum of $n$ i.i.d. Poisson Random Variables.


Let $\{X_i \}_{i=1}^n$ be i.i.d. Poisson random variables with mean $\lambda$ and let $$M_n = \max_{1 \le i \le n} X_i.$$ What is a tight upper bound on $\mathbb{E}[M_n]$? I can prove that \begin{equation} \mathbb{E}[M_n] \le \frac{(\lambda+1) \log n}{\log \log n} + O \left( \frac{1}{\log \log n} \right), \end{equation} but numerically this bound is not tight. Can someone give a tighter analysis or point to a reference with one?
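
(For concreteness, this is the kind of numerical comparison I mean; a minimal Monte Carlo sketch in Python/NumPy, where the choices of $\lambda$, $n$ and the number of trials are arbitrary and only illustrative.)

```python
import numpy as np

rng = np.random.default_rng(0)

lam, n, trials = 2.0, 1000, 2000  # arbitrary illustrative choices

# Empirical E[M_n]: average over many trials of the maximum of n i.i.d. Poisson(lam) draws.
maxima = rng.poisson(lam, size=(trials, n)).max(axis=1)
empirical = maxima.mean()

# Leading term of the bound above: (lam + 1) * log n / log log n.
bound = (lam + 1) * np.log(n) / np.log(np.log(n))

print(f"empirical E[M_n] ~ {empirical:.2f}, bound ~ {bound:.2f}")
```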

Proof of my bound:

Let $s > 0$. Since $e^{sM_n} \le \sum_{i=1}^n e^{sX_i}$, Jensen's inequality and the Poisson moment generating function give $$ e^{s \mathbb{E}[M_n]} \le \mathbb{E}[e^{sM_n}] \le \sum_{i=1}^n \mathbb{E}[e^{sX_i}] = ne^{\lambda(e^s-1)}.$$ Taking the logarithm of both sides and dividing by $s$ gives $$ \mathbb{E}[M_n] \le \frac{\log n}s + \frac{\lambda(e^s-1)}{s}.$$ Finally, the minimizer of the right-hand side is around $s = \log \log n$, which gives the bound above.
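
Spelling out that last step: with $s = \log \log n$ we have $e^s = \log n$, so the right-hand side becomes $$ \frac{\log n}{\log \log n} + \frac{\lambda(\log n - 1)}{\log \log n} = \frac{(\lambda+1)\log n}{\log \log n} - \frac{\lambda}{\log \log n}, $$ which is the bound stated above.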

Best answer:

The paper https://arxiv.org/pdf/0903.4373.pdf mentions that, as $n \to \infty$, $M_n$ concentrates on two adjacent integer values located asymptotically at $\sim \log n/ \log \log n$, independently of $\lambda$, with probability approaching 1. Even tighter estimates are given there.
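
(Not from the paper, just an illustrative check: a quick simulation sketch in Python/NumPy of how much of the mass of $M_n$ sits on its two most frequent values; $\lambda$, $n$ and the number of trials are arbitrary.)

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(1)

lam, n, trials = 1.0, 10**5, 1000  # arbitrary illustrative choices

# Sample M_n = max of n i.i.d. Poisson(lam) draws, repeated over many trials.
maxima = [int(rng.poisson(lam, size=n).max()) for _ in range(trials)]

top_two = Counter(maxima).most_common(2)  # the two most frequent values of M_n
mass = sum(count for _, count in top_two) / trials
print(f"two most frequent values of M_n: {top_two}; combined frequency {mass:.3f}")
```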

Another answer:

After the fact, but a quick observation: in your proof, set instead $$ s \stackrel{\rm def}{=} \log\left(\frac{\log n}{\lambda}+1\right). $$ Then $e^s - 1 = \frac{\log n}{\lambda}$, so $$ \mathbb{E}[M_n] \leq \frac{\log n}{s} + \frac{\lambda(e^s-1)}{s} = \frac{2\log n}{s} \operatorname*{\sim}_{n\to\infty} \frac{2\log n}{\log\log n}, $$ which is better than your bound for $\lambda >1$.
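
(A rough side-by-side of the leading terms of the two bounds, as a Python sketch; the values of $\lambda$ and $n$ are arbitrary illustrative choices.)

```python
import numpy as np

def original_bound(lam, n):
    # Leading term of the bound in the question: (lam + 1) * log n / log log n.
    return (lam + 1) * np.log(n) / np.log(np.log(n))

def improved_bound(lam, n):
    # 2 * log n / s with s = log(log n / lam + 1), as above.
    s = np.log(np.log(n) / lam + 1)
    return 2 * np.log(n) / s

n = 10**6
for lam in (0.5, 2.0, 5.0):
    print(lam, round(original_bound(lam, n), 2), round(improved_bound(lam, n), 2))
```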