Show that $I_{ij}(\pi;X) = -\frac{\partial^2}{\partial\pi_i\,\partial\pi_j}\log c(\pi)$.


Suppose that $$f_X(x\mid \pi) = c(\pi)h(x)\exp\bigg(\sum\limits_{i=1}^k\pi_i\tau_i(x)\bigg)$$ (an exponential family in natural parametrisation).

Exercise: Show that $I_{ij}(\pi;X) = -\dfrac{\partial^2}{\partial\pi_i\,\partial\pi_j}\log c(\pi)$, where $I(\pi;X)$ is the Fisher information matrix.

What I've tried: $I_{i,j}(\pi;X) = \operatorname{Cov}_\pi\bigg(\dfrac{\partial}{\partial\pi_i}\log f_X(X \mid \pi), \dfrac{\partial}{\partial\pi_j}\log f_X(X \mid \pi)\bigg)$. Hence, if I'm not mistaken: $I_{i,j}(\pi; X) = \operatorname{Cov}_\pi\bigg(\dfrac{\partial}{\partial\pi_i}\log c(\pi) + \tau_i(X),\dfrac{\partial}{\partial\pi_j}\log c(\pi) + \tau_j(X)\bigg)$. I'm not sure how to proceed here, but I'm pretty sure that I'm not on the right track.

Question: How do I show that $I_{ij}(\pi;X) = -\dfrac{\partial^2}{\partial\pi_i\,\partial\pi_j}\log c(\pi)$?

Best answer:

Under the usual regularity conditions (which exponential families satisfy), the Fisher information can be written as:

\begin{equation} \begin{split} \mathcal{I}_{ij}(\pi) &= -E_X\bigg[\frac{\partial^2}{\partial\pi_i\,\partial\pi_j}\log f_X(x\mid\pi)\bigg]\\ &= -E_X\bigg[\frac{\partial^2}{\partial\pi_i\,\partial\pi_j}\bigg(\log c(\pi) + \log h(x) + \sum_{k=1}^K \pi_k\tau_k(x)\bigg)\bigg] \end{split} \end{equation}

Noting in the last expression that $\log h(x)$ does not depend on $\pi$ and that $\sum_{k=1}^K \pi_k\tau_k(x)$ is linear in $\pi$, we have $\frac{\partial^2}{\partial\pi_i\,\partial\pi_j}\log h(x) = \frac{\partial^2}{\partial\pi_i\,\partial\pi_j}\sum_{k=1}^K \pi_k\tau_k(x) = 0$. The previous expression therefore reduces to: \begin{equation} -E_X\bigg[\frac{\partial^2}{\partial\pi_i\,\partial\pi_j}\bigg(\log c(\pi) + \log h(x) + \sum_{k=1}^K \pi_k\tau_k(x)\bigg)\bigg] = -E_X\bigg[\frac{\partial^2}{\partial\pi_i\,\partial\pi_j}\log c(\pi)\bigg] \end{equation}

This is constant with respect to $X$, so the expectation drops out and we get:

\begin{equation} \mathcal{I}_{ij}(\pi) = -\frac{\partial^2}{\partial\pi_i\,\partial\pi_j}\log c(\pi) \end{equation}
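As a sanity check (not part of the original exercise), the identity can be verified numerically for a one-parameter example of my choosing: the Poisson family in natural parametrisation, $f(x\mid\eta) = c(\eta)\,\frac{1}{x!}\,e^{\eta x}$ with $c(\eta) = \exp(-e^\eta)$, where $\eta = \log\lambda$. Here $-\frac{d^2}{d\eta^2}\log c(\eta) = e^\eta = \lambda$, which should match $\operatorname{Var}(\tau(X)) = \operatorname{Var}(X)$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Poisson in natural parametrisation: eta = log(lambda),
# f(x|eta) = c(eta) * (1/x!) * exp(eta*x), with c(eta) = exp(-exp(eta)).
eta = 0.7
lam = np.exp(eta)

# Fisher information via the identity: log c(eta) = -exp(eta),
# so -d^2/deta^2 log c(eta) = exp(eta) = lambda.
fisher_from_c = np.exp(eta)

# Cross-check 1: numerical second derivative of log c.
def log_c(e):
    return -np.exp(e)

h = 1e-4
fisher_numeric = -(log_c(eta + h) - 2 * log_c(eta) + log_c(eta - h)) / h**2

# Cross-check 2: Fisher information as Var(tau(X)) = Var(X), by Monte Carlo.
samples = rng.poisson(lam, size=1_000_000)
fisher_mc = samples.var()

print(fisher_from_c, fisher_numeric, fisher_mc)
```

All three numbers should agree (the Monte Carlo estimate up to sampling noise), illustrating both the second-derivative route of the answer and the covariance route attempted in the question.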