Fisher information matrix for normal distribution


[Image: excerpt from the lecture notes showing the Fisher information matrix computation]

The image above is captured from my lecture notes. For the entries in the third column of the first and second rows, and in the third row of the first and second columns: is it because the summation of $x_i - \alpha - \beta z_i$ equals zero that these four entries are zero?

And why is the entry in the third row, third column equal to $2n/\sigma^2$ rather than what we get from the second derivative with respect to $\sigma$?

Thanks a lot

Best answer:

I think you may be forgetting the (conditional) expectation in the definition of Fisher information. One thing that may have contributed to the confusion is that the log-likelihood in your notes is denoted $\ell(\boldsymbol{\theta})$ rather than $\ell(\mathbf{X};\boldsymbol{\theta})$.

The definition of Fisher information is: $$ \mathcal{I}(\boldsymbol{\theta}) = \mathbb{E}\left(-\frac{\partial^2}{\partial \boldsymbol{\theta}\,\partial \boldsymbol{\theta}^\top}\ell(\mathbf{X};\boldsymbol{\theta}) \,\middle|\, \boldsymbol{\theta}\right) $$
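The notes' likelihood is not reproduced here, but the derivatives below are consistent with the simple linear regression model (this is an assumption inferred from the residual terms $x_i - \alpha - \beta z_i$, not taken verbatim from the notes): $$ x_i = \alpha + \beta z_i + \varepsilon_i, \qquad \varepsilon_i \sim N(0,\sigma^2), $$ so that, up to an additive constant, $$ \ell(\mathbf{X};\boldsymbol{\theta}) = -n\log\sigma - \frac{1}{2\sigma^2}\sum_{i=1}^n (x_i - \alpha - \beta z_i)^2. $$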

We have $$ \mathbb{E}_{x}( \dfrac{\partial^2 \ell(\mathbf{X};\boldsymbol{\theta})}{\partial\alpha\partial\sigma} | \alpha,\beta,\sigma) = 0 $$ which is clear since $\mathbb{E}_{x_i}( (x_i - \alpha-\beta z_i) | \alpha,\beta,\sigma) = 0$ for all $i$. Likewise $ \mathbb{E}_{x}( \dfrac{\partial^2 \ell(\mathbf{X};\boldsymbol{\theta})}{\partial\beta\partial\sigma} | \alpha,\beta,\sigma) = 0$.
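As a quick numerical sanity check (a sketch with arbitrary illustrative parameter values, not taken from the notes), the following Monte Carlo simulation evaluates the mixed partial $\frac{\partial^2 \ell}{\partial\alpha\,\partial\sigma} = -\frac{2}{\sigma^3}\sum_i (x_i - \alpha - \beta z_i)$ over many simulated data sets: each individual draw is far from zero, but the average, which estimates the conditional expectation, is approximately zero.

```python
import numpy as np

# Hypothetical parameter values (not from the notes) for the assumed model
# x_i = alpha + beta * z_i + eps_i,  eps_i ~ N(0, sigma^2).
rng = np.random.default_rng(0)
alpha, beta, sigma, n = 1.0, 2.0, 0.5, 50
z = np.linspace(0.0, 1.0, n)

# Simulate many data sets and evaluate the mixed partial derivative
# d^2 l / (d alpha d sigma) = -(2 / sigma^3) * sum_i (x_i - alpha - beta*z_i)
reps = 20_000
x = alpha + beta * z + rng.normal(0.0, sigma, size=(reps, n))
resid = x - alpha - beta * z
mixed = -2.0 * resid.sum(axis=1) / sigma**3

# Individual draws of the mixed partial vary widely, but their average
# (the Monte Carlo estimate of the conditional expectation) is close to
# zero, matching the zero off-diagonal entries in the information matrix.
print("mean of mixed partial:", mixed.mean())
print("std of individual draws:", mixed.std())
```

The same experiment with $\partial^2\ell/\partial\beta\,\partial\sigma = -\frac{2}{\sigma^3}\sum_i z_i(x_i - \alpha - \beta z_i)$ likewise averages to zero.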

It's easy to show that: $\mathbb{E}\left(\frac{\partial^2}{\partial\sigma^2}\ell(\mathbf{X};\boldsymbol{\theta}) \,\middle|\, \alpha,\beta,\sigma\right) = \frac{-2n}{\sigma^2}$.
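To sketch that computation, assume (as the notes' derivatives suggest) the Gaussian log-likelihood $\ell(\mathbf{X};\boldsymbol{\theta}) = -n\log\sigma - \frac{1}{2\sigma^2}\sum_{i=1}^n (x_i-\alpha-\beta z_i)^2$ up to an additive constant. Then $$ \frac{\partial \ell}{\partial\sigma} = -\frac{n}{\sigma} + \frac{1}{\sigma^3}\sum_{i=1}^n (x_i-\alpha-\beta z_i)^2, \qquad \frac{\partial^2 \ell}{\partial\sigma^2} = \frac{n}{\sigma^2} - \frac{3}{\sigma^4}\sum_{i=1}^n (x_i-\alpha-\beta z_i)^2. $$ The second derivative itself is random; taking the conditional expectation and using $\mathbb{E}\left((x_i-\alpha-\beta z_i)^2 \,\middle|\, \alpha,\beta,\sigma\right) = \sigma^2$ gives $$ \mathbb{E}\left(\frac{\partial^2 \ell}{\partial\sigma^2} \,\middle|\, \alpha,\beta,\sigma\right) = \frac{n}{\sigma^2} - \frac{3n\sigma^2}{\sigma^4} = -\frac{2n}{\sigma^2}. $$ That expectation step is why the matrix entry is $2n/\sigma^2$ rather than the raw second derivative you computed.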