I have a binary dependent variable. Given a row vector of explanatory variables $\boldsymbol {x_i}$, I model the probability of the dependent variable being 1 as $F(\boldsymbol{x_i\beta})$. To estimate $\boldsymbol\beta$, I differentiate the log-likelihood: $$\frac{\partial \ln L(\boldsymbol\beta | \boldsymbol X,\boldsymbol y)}{\partial \boldsymbol\beta} = \sum_{i=1}^n \frac{f(\boldsymbol {x_i\beta})\boldsymbol{x_i'}(y_i - F(\boldsymbol{x_i\beta}))}{F(\boldsymbol{x_i\beta})(1-F(\boldsymbol{x_i\beta}))}$$ where there are $n$ observations, $f = \frac{\mathrm d F}{\mathrm d (\boldsymbol{x_i\beta})}$, and $\boldsymbol X$ is the data matrix whose rows are the $\boldsymbol{x_i}$. I need the covariance matrix of the estimator $\boldsymbol{\hat\beta}$, which is approximated by $$\mathrm{Var}\left(\boldsymbol{\hat\beta}\right) \approx \left(-\mathrm E\left[\frac{\partial^2\ln L(\boldsymbol\beta |\boldsymbol X,\boldsymbol y)}{\partial\boldsymbol\beta\partial\boldsymbol{{\beta '}}}\right]\right)^{-1}.$$ The notes I'm reading evaluate this to be $$\left(\sum_{i=1}^n\frac{(f(\boldsymbol{x_i\beta}))^2\boldsymbol{{x_i'x_i}}}{F(\boldsymbol{x_i\beta})(1-F(\boldsymbol{x_i\beta}))}\right)^{-1},$$ but I want to work through the derivation myself.
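For concreteness, here is a quick numeric check of the score expression in the logit special case, where $F(z) = 1/(1+e^{-z})$ and $f = F(1-F)$; the sample size, coefficients, and data below are made up purely for illustration:

```python
import numpy as np

# Sanity check of the score formula, assuming the logit case, where
# f = F(1-F) and the f/(F(1-F)) weight collapses to 1.
# All data below are simulated; n, k, beta are illustrative choices.
rng = np.random.default_rng(0)
n, k = 200, 3
X = rng.normal(size=(n, k))
beta = np.array([0.5, -1.0, 0.25])
F = 1.0 / (1.0 + np.exp(-X @ beta))
y = (rng.uniform(size=n) < F).astype(float)

def loglik(b):
    """Bernoulli log-likelihood sum_i [y_i ln F_i + (1-y_i) ln(1-F_i)]."""
    p = 1.0 / (1.0 + np.exp(-X @ b))
    return np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

# Analytic score: sum_i f_i x_i' (y_i - F_i) / (F_i (1 - F_i))
f = F * (1.0 - F)
score = X.T @ (f * (y - F) / (F * (1.0 - F)))

# Central finite-difference gradient of the log-likelihood for comparison
eps = 1e-6
num = np.array([(loglik(beta + eps * e) - loglik(beta - eps * e)) / (2 * eps)
                for e in np.eye(k)])
print(np.max(np.abs(score - num)))  # small (finite-difference error)
```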
My attempt: applying the quotient rule to each summand, with $F$ and $f$ evaluated at $\boldsymbol{x_i\beta}$, $-\frac{\partial^2\ln L}{\partial\boldsymbol\beta\partial\boldsymbol{\beta '}} =$ $$-\sum_{i=1}^n \frac{\left(\frac{\mathrm df(\boldsymbol{x_i\beta})}{\mathrm d(\boldsymbol{x_i\beta})} \boldsymbol{x_i'x_i}(y_i - F) - (f(\boldsymbol{x_i\beta}))^2\boldsymbol{x_i'x_i}\right)F(1-F) - f(\boldsymbol{x_i\beta})\boldsymbol{x_i'}(y_i - F)\big(f\cdot(1-F) - F\cdot f\big)\boldsymbol{x_i}}{(F(1-F))^2} $$ Is this attempt correct, and can you show how to get the same answer as displayed in the notes?
Note that $y_i$ is Bernoulli with $\Pr(y_i = 1 \mid \boldsymbol x_i) = F(\boldsymbol{x_i\beta})$, so $$\mathrm E \big[ y_i - F(\boldsymbol {x_i\beta}) \mid \boldsymbol x_i\big] = \mathrm E [ y_i \mid \boldsymbol x_i ] - F(\boldsymbol{x_i\beta}) = 0.$$ Hence, when you take the expectation of your Hessian (conditioning on the $\boldsymbol x_i$), every term containing the factor $(y_i - F)$ vanishes. The only surviving term gives $$-\mathrm E\left[\frac{\partial^2\ln L}{\partial\boldsymbol\beta\partial\boldsymbol{\beta'}}\right] = \sum_{i=1}^n\frac{(f(\boldsymbol{x_i\beta}))^2\boldsymbol{x_i'x_i}\,F(1-F)}{(F(1-F))^2} = \sum_{i=1}^n\frac{(f(\boldsymbol{x_i\beta}))^2\boldsymbol{x_i'x_i}}{F(1-F)},$$ which is exactly the expression in your notes.
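As a sanity check, the result can be confirmed numerically in the logit special case, where $f = F(1-F)$ and, conveniently, the observed Hessian does not depend on $y$, so the expectation step is exact. Everything below, data included, is simulated for illustration only:

```python
import numpy as np

# Numeric check that the information matrix sum_i f_i^2 x_i'x_i / (F_i(1-F_i))
# equals the negative Hessian of the log-likelihood, assuming the logit case
# F(z) = 1/(1+exp(-z)). All data below are simulated.
rng = np.random.default_rng(1)
n, k = 300, 2
X = rng.normal(size=(n, k))
beta = np.array([0.8, -0.4])
F = 1.0 / (1.0 + np.exp(-X @ beta))
y = (rng.uniform(size=n) < F).astype(float)

def score(b):
    """Analytic logit score: the f/(F(1-F)) weight collapses to 1."""
    p = 1.0 / (1.0 + np.exp(-X @ b))
    return X.T @ (y - p)

# Analytic information matrix: sum_i f_i^2 x_i'x_i / (F_i(1-F_i)),
# which for logit is X' diag(F(1-F)) X.
f = F * (1.0 - F)
info = (X * (f ** 2 / (F * (1.0 - F)))[:, None]).T @ X

# Negative Hessian via central finite differences of the score
eps = 1e-5
H = np.zeros((k, k))
for j in range(k):
    e = np.zeros(k)
    e[j] = eps
    H[:, j] = (score(beta + e) - score(beta - e)) / (2 * eps)

print(np.max(np.abs(info + H)))  # small: the information matrix equals -H
```

Inverting `info` then yields the (approximate) covariance matrix of $\boldsymbol{\hat\beta}$ from the question.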