Derive the asymptotic variance for ML estimator with BHHH (Poisson)


I tried to apply the BHHH method (outer product of gradients) to derive the asymptotic variance of the ML estimator for a Poisson distribution:

$\hat{\text{avar}}(\hat{\lambda}) = \left( \sum_i s_{i}(\hat{\lambda}) \cdot s_{i}(\hat{\lambda})' \right)^{-1}$

with $s_{i}(\hat{\lambda})$ being the $i$-th term in the sum of the first derivative of the log-likelihood (the score).

This yields: $\hat{\text{avar}}(\hat{\lambda}) = \left( \sum_i \left( 1 - \frac{2x_i}{\hat{\lambda}} + \frac{x_i^2}{\hat{\lambda}^2} \right) \right)^{-1}$

and

$\hat{\text{avar}}(\hat{\lambda}) = \left[N - \frac{2}{N}\left(\sum_{i}x_i\right)\left(\frac{N}{\hat{\lambda}}\right) + \frac{1}{N}\sum_{i}x_i^2\left(\frac{N}{\hat{\lambda}^2}\right)\right]^{-1} $

which reduces to (and which I think is correct):

$\hat{\text{avar}}(\hat{\lambda}) = \hat{\lambda} / N$
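As a sanity check of this reduction (a minimal sketch, assuming simulated Poisson data with an arbitrary rate of 3.0 and sample size 100,000), the scalar OPG estimate can be compared numerically with $\hat{\lambda}/N$:

```python
import numpy as np

# Hypothetical simulation: check that the BHHH/OPG estimate of
# avar(lambda_hat) is close to lambda_hat / N for Poisson data.
rng = np.random.default_rng(0)
N, lam = 100_000, 3.0
x = rng.poisson(lam, size=N)

lam_hat = x.mean()               # Poisson MLE of lambda
s = x / lam_hat - 1.0            # score terms s_i(lambda_hat) = x_i/lambda_hat - 1
avar_bhhh = 1.0 / np.sum(s * s)  # (sum of outer products)^{-1}, scalar case

print(avar_bhhh, lam_hat / N)
```

The two printed numbers should agree to within sampling noise.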

But that all depends on the following fact, which I had to assume: $\frac{1}{N}\sum_{i}x_i^2 = \hat{\lambda} + \hat{\lambda}^2$

Can you explain where that comes from? I guess it is the estimated second moment, right? In particular, this confuses me because isn't that exactly what I am trying to derive with the BHHH method: an estimator for the variance? I guess I am mixing something up ...
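For what it's worth, a quick numerical check of the assumed moment identity (a sketch on simulated Poisson data; the rate 3.0 and sample size are arbitrary):

```python
import numpy as np

# Check that (1/N) * sum(x_i^2) is close to lambda_hat + lambda_hat^2,
# i.e. the sample second moment matches E[X^2] = Var(X) + E[X]^2 for Poisson.
rng = np.random.default_rng(1)
N, lam = 100_000, 3.0
x = rng.poisson(lam, size=N)

lam_hat = x.mean()
m2 = np.mean(x ** 2)             # (1/N) * sum of x_i^2
print(m2, lam_hat + lam_hat ** 2)
```

The equality holds only approximately in finite samples, by the law of large numbers.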