Error bounds in Poisson limit theorem

Does there exist a version of the Poisson limit theorem with error bounds for the CDF, analogous to the Berry–Esseen theorem for the central limit theorem?


In Vershynin's *High-Dimensional Probability*, the Poisson limit theorem is stated roughly as follows:

Let $X_{N,i} \sim \text{Ber}(p_{N,i})$ be independent. Assume that \begin{align}\tag{1} \lim_{N\to\infty} \max_{i\le N} p_{N,i} = 0\mbox{ and }\lim_{N\to\infty} \sum_{i=1}^N p_{N,i} =: \lambda <\infty \mbox{ exists}. \end{align}

Then,

\begin{align}\tag{2} \lim_{N\to\infty} \left| \mathbb{P}\left\{ \sum_{i=1}^N X_{N,i} \le t\right\} - \mathbb{P} \{Z \le t\}\right| = 0 \end{align}

for every $t \in \mathbb{R}$, where $Z \sim \mbox{Poi}(\lambda)$.
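To see (2) in action, here is a minimal numerical sketch for the simplest array $p_{N,i} \equiv \lambda/N$, so that $\sum_{i=1}^N X_{N,i} \sim \text{Bin}(N, \lambda/N)$; the value $\lambda = 3$ and the `scipy`-based check are my own illustrative choices, not part of the theorem.

```python
import numpy as np
from scipy.stats import binom, poisson

# Sketch: worst-case CDF discrepancy between Bin(N, lam/N) and Poi(lam),
# assuming the simplest triangular array p_{N,i} = lam / N
# (lam = 3 is an arbitrary illustrative choice).
lam = 3.0
t = np.arange(0, 30)  # integer thresholds; both CDFs are flat in between
for N in [10, 100, 1000, 10000]:
    err = np.max(np.abs(binom.cdf(t, N, lam / N) - poisson.cdf(t, lam)))
    print(f"N = {N:5d}   sup_t |CDF difference| = {err:.2e}")
```

In this toy case the discrepancy appears to shrink roughly like $1/N$, which already hints that an explicit bound should involve the $p_{N,i}$ themselves and not just $N$.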

I'm wondering if (2) can be improved to a statement of the form

$$ \left| \mathbb{P}\left\{ \sum_{i=1}^N X_{N,i} \le t\right\} - \mathbb{P} \{Z \le t\}\right|\le (\mbox{some explicit function of $N$, etc., which $\to 0$}). $$

It seems that such a bound would necessarily depend on the rate of convergence of the sequences in (1), as the sketch below suggests. Does such a version of the Poisson limit theorem exist?
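As a rough illustration of that rate dependence, the following compares two hypothetical arrays with the same limit $\lambda$: one with $\sum_i p_{N,i} = \lambda$ exactly for every $N$, and one with $\sum_i p_{N,i} = \lambda(1 + 1/\log N)$, which converges only slowly; both choices are mine, purely for illustration.

```python
import numpy as np
from scipy.stats import binom, poisson

# Two triangular arrays with the same limit lam, but with
# sum_i p_{N,i} -> lam at different speeds.
lam = 3.0
t = np.arange(0, 30)
for N in [100, 1000, 10000, 100000]:
    fast = np.max(np.abs(binom.cdf(t, N, lam / N) - poisson.cdf(t, lam)))
    p_slow = lam * (1 + 1 / np.log(N)) / N  # sum of p's misses lam by lam/log N
    slow = np.max(np.abs(binom.cdf(t, N, p_slow) - poisson.cdf(t, lam)))
    print(f"N = {N:6d}   exact-mean array: {fast:.1e}   slow array: {slow:.1e}")
```

At least in this toy comparison, the error in the second array is dominated by how far $\sum_i p_{N,i}$ still is from $\lambda$, so no function of $N$ alone could bound both cases.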

Naturally, concentration inequalities like Chernoff's give good upper bounds on, say, the probability that $\sum_{i=1}^N X_{N,i}$ is much smaller than its mean, but such inequalities only control the tail from one side and aren't helpful when one also wants a corresponding bound in the other direction. All the Poisson limit theorem says is that if $p_{N,i} \equiv p(N)$ and $\lim_{N\to\infty} Np(N) = \lambda$, then $\mathbb{P}\left\{ \sum_{i=1}^N X_{N,i} \le t\right\} = \mathbb{P}(Z\le t) + o(1)$, with no control over the error term $o(1)$. For many purposes the $o(1)$ bound may be enough, but I'm wondering if there's a sharper version for situations where it is not.
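To make the gap concrete, here is a sketch (again assuming $p_{N,i} \equiv \lambda/N$, with arbitrary parameter choices) comparing the exact lower-tail probability with the Poisson value and the standard multiplicative Chernoff bound $\mathbb{P}\{S \le (1-\delta)\mu\} \le e^{-\delta^2 \mu/2}$:

```python
import numpy as np
from scipy.stats import binom, poisson

# Lower tail P(sum X_{N,i} <= t) with p_{N,i} = lam / N: exact binomial
# value, Poisson approximation, and the one-sided multiplicative
# Chernoff bound P(S <= (1 - delta) * mu) <= exp(-delta^2 * mu / 2).
lam, t = 10.0, 3
delta = 1 - t / lam                      # here mu = N * (lam / N) = lam
chernoff = np.exp(-delta**2 * lam / 2)
for N in [100, 1000, 10000]:
    exact = binom.cdf(t, N, lam / N)
    print(f"N = {N:5d}   exact = {exact:.3e}   "
          f"Poisson = {poisson.cdf(t, lam):.3e}   Chernoff <= {chernoff:.3e}")
```

The Chernoff value is a valid upper bound but loose here, and it says nothing from below, whereas the Poisson value tracks the exact probability closely; the limit theorem alone, however, attaches no quantitative guarantee to that closeness.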